Oct 03 07:48:17 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 03 07:48:18 crc restorecon[4663]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 
07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc 
restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:18 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 07:48:19 crc restorecon[4663]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 07:48:19 crc restorecon[4663]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 03 07:48:19 crc kubenswrapper[4664]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 07:48:19 crc kubenswrapper[4664]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 03 07:48:19 crc kubenswrapper[4664]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 07:48:19 crc kubenswrapper[4664]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
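[Annotation: the restorecon pass ends here. Each "not reset as customized by admin" entry above means restorecon found the file already carrying a type the SELinux policy lists as customizable (container_file_t, usually with per-pod MCS categories such as s0:c7,c13) and deliberately left it alone; only genuinely mislabeled paths, such as the kubenswrapper binary relabeled from bin_t to kubelet_exec_t, were reset. A minimal sketch for inspecting and, if needed, forcing such a relabel, assuming a RHEL-family host with the targeted policy; the target path is taken from the log, and the commands are standard SELinux tooling, not part of this unit's output:

    # types restorecon treats as admin customizations and will not reset
    cat /etc/selinux/targeted/contexts/customizable_types

    # compare the current label with what the policy would assign
    ls -Zd /var/lib/kubelet/plugins
    matchpathcon /var/lib/kubelet/plugins

    # force the reset despite the customizable type (recursive, verbose)
    restorecon -RvF /var/lib/kubelet/plugins

Without -F, restorecon -Rv would skip these files exactly as logged above, which is why the relabel messages and the "not reset" messages coexist in one pass.]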
Oct 03 07:48:19 crc kubenswrapper[4664]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 03 07:48:19 crc kubenswrapper[4664]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.658513 4664 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664376 4664 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664430 4664 feature_gate.go:330] unrecognized feature gate: Example Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664436 4664 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664440 4664 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664444 4664 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664447 4664 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664451 4664 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664456 4664 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664459 4664 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664466 4664 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664470 4664 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664474 4664 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664477 4664 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664481 4664 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664484 4664 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664488 4664 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664491 4664 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664495 4664 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664499 4664 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664503 4664 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664507 4664 
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664376 4664 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664430 4664 feature_gate.go:330] unrecognized feature gate: Example
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664436 4664 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664440 4664 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664444 4664 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664447 4664 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664451 4664 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664456 4664 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664459 4664 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664466 4664 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664470 4664 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664474 4664 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664477 4664 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664481 4664 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664484 4664 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664488 4664 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664491 4664 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664495 4664 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664499 4664 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664503 4664 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664507 4664 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664511 4664 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664515 4664 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664519 4664 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664522 4664 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664526 4664 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664529 4664 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664533 4664 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664536 4664 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664540 4664 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664543 4664 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664547 4664 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664550 4664 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664554 4664 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664558 4664 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664561 4664 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664564 4664 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664568 4664 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664572 4664 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664575 4664 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664578 4664 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664582 4664 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664585 4664 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664589 4664 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664592 4664 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664598 4664 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664622 4664 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664627 4664 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664632 4664 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664636 4664 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664641 4664 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664646 4664 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664651 4664 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664655 4664 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664659 4664 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664662 4664 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664666 4664 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664669 4664 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664672 4664 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664676 4664 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664679 4664 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664683 4664 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664687 4664 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664692 4664 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664696 4664 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664700 4664 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664704 4664 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664709 4664 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664713 4664 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664718 4664 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.664722 4664 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
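These long runs of "unrecognized feature gate" warnings are expected on OpenShift: the node config carries the cluster's own gate names (GatewayAPI, NewOLM, ...), and the upstream Kubernetes feature-gate parser only recognizes upstream gates, warning on the rest while still applying the ones it knows. A schematic Go sketch of the three message kinds seen here (feature_gate.go:330/351/353); the real registry lives in k8s.io/component-base/featuregate, and the stage table below is a made-up subset, not the actual one:

```go
package main

import "log"

// Schematic stand-in for the Kubernetes feature-gate registry, mirroring
// the three message kinds in the log: unrecognized gates, deprecated gates,
// and GA gates that are still being set explicitly.
type stage int

const (
	alpha stage = iota
	beta
	ga
	deprecated
)

// Assumed subset of known gates, for illustration only.
var known = map[string]stage{
	"CloudDualStackNodeIPs":                  ga,
	"DisableKubeletCloudCredentialProviders": ga,
	"ValidatingAdmissionPolicy":              ga,
	"KMSv1":                                  deprecated,
	"DynamicResourceAllocation":              alpha,
}

func set(enabled map[string]bool, name string, value bool) {
	st, ok := known[name]
	if !ok {
		// cf. feature_gate.go:330 in the log
		log.Printf("unrecognized feature gate: %s", name)
		return
	}
	switch st {
	case deprecated: // cf. feature_gate.go:351
		log.Printf("Setting deprecated feature gate %s=%t. It will be removed in a future release.", name, value)
	case ga: // cf. feature_gate.go:353
		log.Printf("Setting GA feature gate %s=%t. It will be removed in a future release.", name, value)
	}
	enabled[name] = value
}

func main() {
	enabled := map[string]bool{}
	set(enabled, "GatewayAPI", true) // OpenShift-only gate: warns, not applied
	set(enabled, "KMSv1", true)
	set(enabled, "CloudDualStackNodeIPs", true)
}
```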
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664804 4664 flags.go:64] FLAG: --address="0.0.0.0"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664813 4664 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664820 4664 flags.go:64] FLAG: --anonymous-auth="true"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664826 4664 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664832 4664 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664836 4664 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664842 4664 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664848 4664 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664852 4664 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664857 4664 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664862 4664 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664867 4664 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664872 4664 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664876 4664 flags.go:64] FLAG: --cgroup-root=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664881 4664 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664885 4664 flags.go:64] FLAG: --client-ca-file=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664888 4664 flags.go:64] FLAG: --cloud-config=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664893 4664 flags.go:64] FLAG: --cloud-provider=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664897 4664 flags.go:64] FLAG: --cluster-dns="[]"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664902 4664 flags.go:64] FLAG: --cluster-domain=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664906 4664 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664910 4664 flags.go:64] FLAG: --config-dir=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664915 4664 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664919 4664 flags.go:64] FLAG: --container-log-max-files="5"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664924 4664 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664928 4664 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664933 4664 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664937 4664 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664941 4664 flags.go:64] FLAG: --contention-profiling="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664945 4664 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664950 4664 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664954 4664 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664958 4664 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664963 4664 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664967 4664 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664971 4664 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664975 4664 flags.go:64] FLAG: --enable-load-reader="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664979 4664 flags.go:64] FLAG: --enable-server="true"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664983 4664 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664988 4664 flags.go:64] FLAG: --event-burst="100"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664993 4664 flags.go:64] FLAG: --event-qps="50"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.664997 4664 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665001 4664 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665006 4664 flags.go:64] FLAG: --eviction-hard=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665012 4664 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665016 4664 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665021 4664 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665025 4664 flags.go:64] FLAG: --eviction-soft=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665029 4664 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665033 4664 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665037 4664 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665041 4664 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665045 4664 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665049 4664 flags.go:64] FLAG: --fail-swap-on="true"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665053 4664 flags.go:64] FLAG: --feature-gates=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665058 4664 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665062 4664 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665066 4664 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665071 4664 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665075 4664 flags.go:64] FLAG: --healthz-port="10248"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665079 4664 flags.go:64] FLAG: --help="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665084 4664 flags.go:64] FLAG: --hostname-override=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665088 4664 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665092 4664 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665097 4664 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665101 4664 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665105 4664 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665109 4664 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665113 4664 flags.go:64] FLAG: --image-service-endpoint=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665117 4664 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665121 4664 flags.go:64] FLAG: --kube-api-burst="100"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665125 4664 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665130 4664 flags.go:64] FLAG: --kube-api-qps="50"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665134 4664 flags.go:64] FLAG: --kube-reserved=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665138 4664 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665142 4664 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665148 4664 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665152 4664 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665156 4664 flags.go:64] FLAG: --lock-file=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665160 4664 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665164 4664 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665168 4664 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665178 4664 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665182 4664 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665186 4664 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665190 4664 flags.go:64] FLAG: --logging-format="text"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665194 4664 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665199 4664 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665203 4664 flags.go:64] FLAG: --manifest-url=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665207 4664 flags.go:64] FLAG: --manifest-url-header=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665219 4664 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665223 4664 flags.go:64] FLAG: --max-open-files="1000000"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665229 4664 flags.go:64] FLAG: --max-pods="110"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665233 4664 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665237 4664 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665241 4664 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665245 4664 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665249 4664 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665253 4664 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665258 4664 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665268 4664 flags.go:64] FLAG: --node-status-max-images="50"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665272 4664 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665276 4664 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665280 4664 flags.go:64] FLAG: --pod-cidr=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665284 4664 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665290 4664 flags.go:64] FLAG: --pod-manifest-path=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665294 4664 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665298 4664 flags.go:64] FLAG: --pods-per-core="0"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665302 4664 flags.go:64] FLAG: --port="10250"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665307 4664 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665311 4664 flags.go:64] FLAG: --provider-id=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665315 4664 flags.go:64] FLAG: --qos-reserved=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665319 4664 flags.go:64] FLAG: --read-only-port="10255"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665323 4664 flags.go:64] FLAG: --register-node="true"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665327 4664 flags.go:64] FLAG: --register-schedulable="true"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665332 4664 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665338 4664 flags.go:64] FLAG: --registry-burst="10"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665342 4664 flags.go:64] FLAG: --registry-qps="5"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665347 4664 flags.go:64] FLAG: --reserved-cpus=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665351 4664 flags.go:64] FLAG: --reserved-memory=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665356 4664 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665360 4664 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665364 4664 flags.go:64] FLAG: --rotate-certificates="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665369 4664 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665373 4664 flags.go:64] FLAG: --runonce="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665377 4664 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665381 4664 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665385 4664 flags.go:64] FLAG: --seccomp-default="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665389 4664 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665394 4664 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665398 4664 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665402 4664 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665406 4664 flags.go:64] FLAG: --storage-driver-password="root"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665410 4664 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665414 4664 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665418 4664 flags.go:64] FLAG: --storage-driver-user="root"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665422 4664 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665427 4664 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665431 4664 flags.go:64] FLAG: --system-cgroups=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665435 4664 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665441 4664 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665445 4664 flags.go:64] FLAG: --tls-cert-file=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665449 4664 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665454 4664 flags.go:64] FLAG: --tls-min-version=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665458 4664 flags.go:64] FLAG: --tls-private-key-file=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665462 4664 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665466 4664 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665470 4664 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665474 4664 flags.go:64] FLAG: --v="2"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665480 4664 flags.go:64] FLAG: --version="false"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665485 4664 flags.go:64] FLAG: --vmodule=""
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665490 4664 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665494 4664 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
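Each flags.go:64 line above is the kubelet echoing one parsed command-line flag with its effective value. With the spf13/pflag library that Kubernetes components use, the same dump falls out of FlagSet.VisitAll; the two flags registered below are arbitrary examples, not the kubelet's real flag set:

```go
package main

import (
	"log"

	flag "github.com/spf13/pflag" // the flag library Kubernetes components use
)

func main() {
	fs := flag.NewFlagSet("kubelet-demo", flag.ExitOnError)
	fs.String("address", "0.0.0.0", "IP address to serve on")
	fs.Int("max-pods", 110, "maximum pods per node")
	_ = fs.Parse([]string{"--max-pods=110"})

	// Walk every registered flag and log name=value, which is essentially
	// what produces the FLAG: --name="value" lines above.
	fs.VisitAll(func(f *flag.Flag) {
		log.Printf("FLAG: --%s=%q", f.Name, f.Value.String())
	})
}
```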
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665628 4664 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665635 4664 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665640 4664 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665644 4664 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665647 4664 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665651 4664 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665655 4664 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665659 4664 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665667 4664 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665670 4664 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665674 4664 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665677 4664 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665681 4664 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665685 4664 feature_gate.go:330] unrecognized feature gate: Example
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665689 4664 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665692 4664 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665696 4664 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665699 4664 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665703 4664 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665706 4664 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665710 4664 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665713 4664 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665717 4664 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665720 4664 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665724 4664 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665727 4664 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665731 4664 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665734 4664 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665738 4664 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665741 4664 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665745 4664 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665748 4664 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665752 4664 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665755 4664 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665760 4664 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665764 4664 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665769 4664 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665773 4664 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665778 4664 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665782 4664 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665787 4664 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665790 4664 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665794 4664 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665798 4664 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665801 4664 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665805 4664 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665808 4664 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665811 4664 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665816 4664 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665820 4664 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665823 4664 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665827 4664 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665831 4664 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665835 4664 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665840 4664 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665844 4664 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665848 4664 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665851 4664 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665855 4664 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665859 4664 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665864 4664 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665868 4664 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665871 4664 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665875 4664 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665879 4664 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665883 4664 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665886 4664 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665890 4664 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665893 4664 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665897 4664 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.665900 4664 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.665906 4664 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.675544 4664 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.675592 4664 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
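The server.go:493 line prints the GOGC, GOMAXPROCS, and GOTRACEBACK environment variables verbatim; the empty strings mean none were set, so the Go runtime defaults apply. A small sketch reproducing that line for the current process (the default-GOMAXPROCS comment is an assumption based on the 12-core machine info at the end of this log):

```go
package main

import (
	"fmt"
	"os"
	"runtime"
)

func main() {
	// The kubelet's "Golang settings" line prints these environment
	// variables verbatim; empty strings mean they were not set.
	for _, v := range []string{"GOGC", "GOMAXPROCS", "GOTRACEBACK"} {
		fmt.Printf("%s=%q\n", v, os.Getenv(v))
	}
	// With GOMAXPROCS unset, the runtime defaults to the CPU count
	// (presumably 12 on this node, per the machine info below).
	fmt.Println("effective GOMAXPROCS:", runtime.GOMAXPROCS(0))
}
```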
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675693 4664 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675704 4664 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675709 4664 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675713 4664 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675718 4664 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675722 4664 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675728 4664 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675732 4664 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675739 4664 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675746 4664 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675750 4664 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675755 4664 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675758 4664 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675762 4664 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675766 4664 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675770 4664 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675773 4664 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675777 4664 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675780 4664 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675785 4664 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675789 4664 feature_gate.go:330] unrecognized feature gate: Example
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675794 4664 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675799 4664 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675804 4664 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675808 4664 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675811 4664 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675815 4664 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675819 4664 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675822 4664 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675826 4664 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675830 4664 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675833 4664 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675836 4664 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675840 4664 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675843 4664 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675847 4664 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675850 4664 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675854 4664 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675857 4664 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675861 4664 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675864 4664 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675868 4664 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675871 4664 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675875 4664 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675878 4664 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675882 4664 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675885 4664 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675889 4664 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675893 4664 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675896 4664 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675899 4664 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675903 4664 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675907 4664 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675911 4664 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675914 4664 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675918 4664 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675921 4664 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675926 4664 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675930 4664 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675935 4664 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675939 4664 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675943 4664 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675947 4664 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675952 4664 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675956 4664 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675960 4664 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675964 4664 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675970 4664 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675975 4664 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675981 4664 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.675985 4664 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.675993 4664 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676134 4664 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676143 4664 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676147 4664 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676151 4664 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676154 4664 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676158 4664 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676162 4664 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676166 4664 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676169 4664 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676172 4664 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676176 4664 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676181 4664 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676185 4664 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676189 4664 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676194 4664 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676199 4664 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676204 4664 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676208 4664 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676212 4664 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676217 4664 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676221 4664 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676225 4664 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676229 4664 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676234 4664 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676238 4664 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676242 4664 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676245 4664 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676249 4664 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676253 4664 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676256 4664 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676259 4664 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676263 4664 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676266 4664 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676270 4664 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676273 4664 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676277 4664 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676281 4664 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676284 4664 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676288 4664 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676292 4664 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676295 4664 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676299 4664 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676302 4664 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676306 4664 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676311 4664 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676315 4664 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676319 4664 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676322 4664 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676326 4664 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676330 4664 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676333 4664 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676337 4664 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676340 4664 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676344 4664 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676347 4664 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676351 4664 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676354 4664 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676358 4664 feature_gate.go:330] unrecognized feature gate: Example
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676361 4664 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676365 4664 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676368 4664 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676372 4664 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676375 4664 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676379 4664 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676382 4664 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676386 4664 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676389 4664 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676392 4664 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676396 4664 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676399 4664 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.676403 4664 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.676409 4664 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.676587 4664 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.680961 4664 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.681055 4664 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.683040 4664 server.go:997] "Starting client certificate rotation"
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.683066 4664 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.683994 4664 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-07 17:45:14.667567099 +0000 UTC
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.684070 4664 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2313h56m54.983499131s for next certificate rotation
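The certificate_manager lines show the client certificate expiring 2026-02-24 but a rotation deadline of 2026-01-07: the manager schedules rotation at a jittered point late in the certificate's validity window, so a fleet of nodes does not all rotate at once. A sketch of that computation; the 70-90% jitter window and the one-year lifetime are assumptions here, and only the expiry date is taken from the log:

```go
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline picks a random point late in the certificate's validity
// window, mirroring the idea behind the kubelet certificate manager's
// scheduling. The exact jitter range is an upstream implementation detail;
// ~70-90% of the lifetime is assumed here.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	jitter := 0.7 + 0.2*rand.Float64() // assumed window
	return notBefore.Add(time.Duration(float64(total) * jitter))
}

func main() {
	notAfter := time.Date(2026, 2, 24, 5, 52, 8, 0, time.UTC) // from the log
	notBefore := notAfter.AddDate(-1, 0, 0)                   // assumed 1y lifetime
	d := rotationDeadline(notBefore, notAfter)
	fmt.Println("rotation deadline:", d)
	fmt.Println("waiting:", time.Until(d)) // cf. the "Waiting 2313h..." line
}
```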
4664 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.794745 4664 manager.go:217] Machine: {Timestamp:2025-10-03 07:48:19.792369547 +0000 UTC m=+0.613560057 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7be6e848-96ef-48b1-8627-9ddc13d5cc87 BootID:eca81c87-676e-4667-a87b-e015ec0be81c Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:33:03:9e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:33:03:9e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:9f:73:c8 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ca:36:fa Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:eb:71:61 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:55:b4:a0 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ea:9d:54:dc:08:94 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:e6:5c:7e:25:c1:c4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.794967 4664 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
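
The cAdvisor "Machine:" entry above packs the node's whole hardware inventory into a single line. Below is a minimal Python sketch for pulling the headline numbers back out of such a line with a regex; machine_line and the field() helper are hypothetical names used for illustration only, while the field names (NumCores, MemoryCapacity, CpuFrequency) are taken verbatim from the entry:

    import re

    # Paste the "Machine: {...}" portion of the journal entry here; trimmed for brevity.
    machine_line = ('Machine: {CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 '
                    'NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0}')

    def field(name: str, text: str) -> str:
        # Grab the token following "Name:" (cAdvisor prints space-separated Key:Value pairs).
        m = re.search(rf'\b{name}:(\S+)', text)
        return m.group(1) if m else ''

    cores = int(field('NumCores', machine_line))            # 12
    mem_bytes = int(field('MemoryCapacity', machine_line))  # 33654124544
    freq_khz = int(field('CpuFrequency', machine_line))     # cAdvisor reports kHz
    print(f'{cores} cores, {mem_bytes / 2**30:.1f} GiB RAM, {freq_khz / 1e6:.1f} GHz')

Run against the full entry this prints "12 cores, 31.3 GiB RAM, 2.8 GHz", matching the NumCores, MemoryCapacity and CpuFrequency values logged for this CRC VM.
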
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.795106 4664 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.797654 4664 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.797866 4664 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.797906 4664 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.798140 4664 topology_manager.go:138] "Creating topology manager with none policy" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.798153 4664 container_manager_linux.go:303] "Creating device plugin manager" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.798898 4664 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.798928 4664 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.799111 4664 state_mem.go:36] "Initialized new in-memory state store" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.799185 4664 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.807803 4664 kubelet.go:418] "Attempting to sync node with API server" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.807886 4664 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
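
The nodeConfig blob in the container_manager_linux.go entry above is plain JSON, so the eviction policy buried inside it can be lifted straight out and read. A minimal sketch, where node_config is a hand-trimmed copy of that blob (only the fields used here; values are verbatim from the log entry):

    import json

    node_config = json.loads('''{
      "SystemReserved": {"cpu": "200m", "ephemeral-storage": "350Mi", "memory": "350Mi"},
      "PodPidsLimit": 4096,
      "HardEvictionThresholds": [
        {"Signal": "imagefs.inodesFree", "Operator": "LessThan", "Value": {"Quantity": null, "Percentage": 0.05}},
        {"Signal": "memory.available",   "Operator": "LessThan", "Value": {"Quantity": "100Mi", "Percentage": 0}},
        {"Signal": "nodefs.available",   "Operator": "LessThan", "Value": {"Quantity": null, "Percentage": 0.1}},
        {"Signal": "nodefs.inodesFree",  "Operator": "LessThan", "Value": {"Quantity": null, "Percentage": 0.05}},
        {"Signal": "imagefs.available",  "Operator": "LessThan", "Value": {"Quantity": null, "Percentage": 0.15}}
      ]
    }''')

    for t in node_config["HardEvictionThresholds"]:
        v = t["Value"]
        limit = v["Quantity"] or f'{v["Percentage"]:.0%}'  # a fixed quantity wins over a percentage
        print(f'evict when {t["Signal"]} {t["Operator"]} {limit}')

In other words: this kubelet reserves 200m CPU and 350Mi memory for the system, caps each pod at 4096 PIDs, and starts hard-evicting pods once memory.available drops below 100Mi or nodefs/imagefs free space and inodes fall below the listed percentages.
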
Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.807934 4664 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.807951 4664 kubelet.go:324] "Adding apiserver pod source" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.807965 4664 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.814358 4664 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.816091 4664 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.818079 4664 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Oct 03 07:48:19 crc kubenswrapper[4664]: E1003 07:48:19.818220 4664 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.818242 4664 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.818086 4664 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Oct 03 07:48:19 crc kubenswrapper[4664]: E1003 07:48:19.818274 4664 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.819517 4664 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.819543 4664 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.819552 4664 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.819561 4664 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.819575 4664 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.819583 4664 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.819592 4664 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.819617 4664 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.819627 4664 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.819637 4664 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.819647 4664 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.819654 4664 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.823191 4664 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.823681 4664 server.go:1280] "Started kubelet" Oct 03 07:48:19 crc systemd[1]: Started Kubernetes Kubelet. Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.825344 4664 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.825375 4664 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.825433 4664 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 08:40:54.700665924 +0000 UTC Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.825509 4664 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1368h52m34.875159719s for next certificate rotation Oct 03 07:48:19 crc kubenswrapper[4664]: E1003 07:48:19.825565 4664 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.825570 4664 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.825589 4664 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.828668 4664 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.828726 4664 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 03 07:48:19 crc kubenswrapper[4664]: E1003 07:48:19.829205 4664 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" interval="200ms" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.829364 4664 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.829386 4664 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.829709 4664 factory.go:55] Registering systemd factory Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.829732 4664 factory.go:221] Registration of the systemd container factory successfully Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.830320 4664 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Oct 03 07:48:19 crc kubenswrapper[4664]: E1003 07:48:19.830389 4664 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.830396 4664 factory.go:153] Registering CRI-O factory Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.830419 4664 factory.go:221] Registration of the crio container factory successfully Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.828738 4664 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.830582 4664 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.830627 4664 factory.go:103] Registering Raw factory Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.830642 4664 manager.go:1196] Started watching for new ooms in manager Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.831312 4664 manager.go:319] Starting recovery of all containers Oct 03 07:48:19 crc kubenswrapper[4664]: E1003 07:48:19.833817 4664 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.190:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186aeba55835d101 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 07:48:19.823653121 +0000 UTC m=+0.644843611,LastTimestamp:2025-10-03 07:48:19.823653121 +0000 UTC m=+0.644843611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.838099 4664 server.go:460] "Adding debug handlers to kubelet server" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840448 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840497 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
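
Every failure in this stretch of the log (the reflector list/watch errors, the node-lease retry, the undeliverable Starting event) is the same symptom: nothing is answering on api-int.crc.testing:6443 yet, because on this node the kube-apiserver itself runs as a static pod that this very kubelet has not started. client-go simply keeps retrying with backoff until it comes up. A quick way to confirm the state from outside, as a diagnostic sketch (host and port copied from the errors above; this is a one-off probe, not something the kubelet runs):

    import socket

    host, port = 'api-int.crc.testing', 6443
    try:
        socket.create_connection((host, port), timeout=3).close()
        print('apiserver is accepting connections')
    except OSError as exc:
        # Expected at this point in the boot: "connection refused" until the
        # kube-apiserver static pod is up; the kubelet retries meanwhile.
        print(f'connect failed: {exc}')
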
volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840525 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840538 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840551 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840563 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840627 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840647 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840660 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840671 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840685 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840696 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840709 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840720 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840733 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840749 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840760 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840771 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840793 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840807 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840819 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840876 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840889 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840899 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840913 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840929 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840942 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840956 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840970 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840982 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.840994 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841007 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841019 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841032 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841043 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841054 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841064 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841075 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841085 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841095 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841104 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841113 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841123 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841133 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841144 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841155 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841166 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841176 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841184 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841192 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841200 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841211 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841221 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841231 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841240 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841250 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841259 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841267 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841276 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841285 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841293 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841302 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841311 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841320 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841329 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841338 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841347 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841354 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841363 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841372 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841381 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841389 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841397 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841407 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841415 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841424 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841432 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841440 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841486 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841498 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841506 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841515 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841526 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841539 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841550 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841562 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841575 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841586 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841596 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841626 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841637 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841649 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841661 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841671 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841682 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841696 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841707 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841717 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841730 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841741 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841751 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841763 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841775 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841819 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841833 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841845 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841859 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841873 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841886 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841900 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841913 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841925 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841937 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841948 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841958 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841970 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841982 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.841993 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842005 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842017 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842027 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842037 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842048 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842058 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842077 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842088 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842098 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842110 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842120 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842130 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842143 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842153 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842164 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842174 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842184 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842196 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842206 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842216 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842227 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842237 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842249 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842259 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842269 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842285 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842295 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842306 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842316 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842326 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842337 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842347 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842357 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842367 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842377 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842386 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842395 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842405 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842418 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842428 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842438 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842447 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842457 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842466 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842476 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842486 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842495 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842505 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842515 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842526 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842537 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842547 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842558 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842587 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842599 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842628 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842640 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842652 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842664 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842676 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842686 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842696 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842707 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842717 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842728 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842738 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842750 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.842776 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845365 4664 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845626 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845647 4664 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845663 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845675 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845689 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845701 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845714 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845727 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845740 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845753 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845767 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845782 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845796 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845812 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845826 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845840 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845854 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845867 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845881 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845894 4664 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845907 4664 reconstruct.go:97] "Volume reconstruction finished" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.845915 4664 reconciler.go:26] "Reconciler: start to sync state" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.853787 4664 manager.go:324] Recovery completed Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.864135 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.867168 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.867215 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.867229 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.868296 4664 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 03 07:48:19 crc 
kubenswrapper[4664]: I1003 07:48:19.868312 4664 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.868333 4664 state_mem.go:36] "Initialized new in-memory state store" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.872369 4664 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.874089 4664 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.874197 4664 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.874876 4664 kubelet.go:2335] "Starting kubelet main sync loop" Oct 03 07:48:19 crc kubenswrapper[4664]: E1003 07:48:19.875032 4664 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 03 07:48:19 crc kubenswrapper[4664]: W1003 07:48:19.875395 4664 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Oct 03 07:48:19 crc kubenswrapper[4664]: E1003 07:48:19.875451 4664 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.892163 4664 policy_none.go:49] "None policy: Start" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.893016 4664 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.893095 4664 state_mem.go:35] "Initializing new in-memory state store" Oct 03 07:48:19 crc kubenswrapper[4664]: E1003 07:48:19.926233 4664 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.944404 4664 manager.go:334] "Starting Device Plugin manager" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.944485 4664 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.944498 4664 server.go:79] "Starting device plugin registration server" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.944980 4664 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.945002 4664 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.945571 4664 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.945793 4664 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.945806 4664 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 03 07:48:19 crc kubenswrapper[4664]: E1003 07:48:19.951560 4664 eviction_manager.go:285] "Eviction manager: 
failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.976234 4664 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.976380 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.977621 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.977662 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.977675 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.977836 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.978081 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.978151 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.978632 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.978660 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.978677 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.978810 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.979118 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.979199 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.979431 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.979464 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.979476 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.979593 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.979665 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.979679 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.979910 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.980007 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.980055 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.980426 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.980461 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.980474 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.980665 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.980692 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.980704 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.980814 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.980975 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.981012 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.981097 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.981130 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.981144 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.981531 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.981572 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.981589 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.981759 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.981786 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.981804 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.981831 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.981872 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.983567 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.983714 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:19 crc kubenswrapper[4664]: I1003 07:48:19.983736 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:20 crc kubenswrapper[4664]: E1003 07:48:20.031188 4664 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" interval="400ms" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.046150 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.047333 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.047463 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.047502 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.047533 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.047560 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.047584 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:20 crc 
kubenswrapper[4664]: I1003 07:48:20.047594 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.047644 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.047677 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.047648 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.047705 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.047724 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.047863 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.048037 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.048055 4664 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.048069 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.048206 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.048228 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.048252 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: E1003 07:48:20.048979 4664 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.190:6443: connect: connection refused" node="crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149674 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149733 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149757 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149774 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149791 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149807 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149821 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149836 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149839 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149853 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149914 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149919 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149945 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149965 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149975 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149984 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.150006 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.150010 4664 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149895 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.150017 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.150079 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.150039 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.150047 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.150054 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.149946 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.150069 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.150100 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.150104 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.150070 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.150032 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.249895 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.251064 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.251107 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.251116 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.251139 4664 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 07:48:20 crc kubenswrapper[4664]: E1003 07:48:20.251599 4664 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.190:6443: connect: connection refused" node="crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.313924 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.322684 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.340151 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.357060 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.364599 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 07:48:20 crc kubenswrapper[4664]: W1003 07:48:20.409137 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-dad0cc7f7c583addc5b413657bf637cb322258992e0bffd55120c8ecd0b31e91 WatchSource:0}: Error finding container dad0cc7f7c583addc5b413657bf637cb322258992e0bffd55120c8ecd0b31e91: Status 404 returned error can't find the container with id dad0cc7f7c583addc5b413657bf637cb322258992e0bffd55120c8ecd0b31e91 Oct 03 07:48:20 crc kubenswrapper[4664]: W1003 07:48:20.410689 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-b2b6a315637e4d6dceaee2cd69ab8f8e3f5f2aa0adb0cafa1508a4fe38233501 WatchSource:0}: Error finding container b2b6a315637e4d6dceaee2cd69ab8f8e3f5f2aa0adb0cafa1508a4fe38233501: Status 404 returned error can't find the container with id b2b6a315637e4d6dceaee2cd69ab8f8e3f5f2aa0adb0cafa1508a4fe38233501 Oct 03 07:48:20 crc kubenswrapper[4664]: W1003 07:48:20.415229 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-cdda46ff6a2f9dbee5d5db85736f199ba873bfa6c242203c562f42b0c5d9e19c WatchSource:0}: Error finding container cdda46ff6a2f9dbee5d5db85736f199ba873bfa6c242203c562f42b0c5d9e19c: Status 404 returned error can't find the container with id cdda46ff6a2f9dbee5d5db85736f199ba873bfa6c242203c562f42b0c5d9e19c Oct 03 07:48:20 crc kubenswrapper[4664]: W1003 07:48:20.416887 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-414066a2cd20bd09473344cdc2b6e27e41e9156beb2d8a239317c8421a60ec4f WatchSource:0}: Error finding container 414066a2cd20bd09473344cdc2b6e27e41e9156beb2d8a239317c8421a60ec4f: Status 404 returned error can't find the container with id 414066a2cd20bd09473344cdc2b6e27e41e9156beb2d8a239317c8421a60ec4f Oct 03 07:48:20 crc kubenswrapper[4664]: W1003 07:48:20.417779 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-ce6a4a58500d24d597989fb60ac4c475e8d8fd63a790bba04f94e49a9247ed39 WatchSource:0}: Error finding container ce6a4a58500d24d597989fb60ac4c475e8d8fd63a790bba04f94e49a9247ed39: Status 404 returned error can't find the container with id ce6a4a58500d24d597989fb60ac4c475e8d8fd63a790bba04f94e49a9247ed39 Oct 03 07:48:20 crc kubenswrapper[4664]: E1003 07:48:20.432879 4664 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" interval="800ms" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.652049 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.653715 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.653769 4664 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.653780 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.653803 4664 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 07:48:20 crc kubenswrapper[4664]: E1003 07:48:20.654199 4664 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.190:6443: connect: connection refused" node="crc" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.830206 4664 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Oct 03 07:48:20 crc kubenswrapper[4664]: W1003 07:48:20.830309 4664 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Oct 03 07:48:20 crc kubenswrapper[4664]: E1003 07:48:20.830361 4664 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Oct 03 07:48:20 crc kubenswrapper[4664]: W1003 07:48:20.838051 4664 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Oct 03 07:48:20 crc kubenswrapper[4664]: E1003 07:48:20.838126 4664 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.878224 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ce6a4a58500d24d597989fb60ac4c475e8d8fd63a790bba04f94e49a9247ed39"} Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.879113 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"414066a2cd20bd09473344cdc2b6e27e41e9156beb2d8a239317c8421a60ec4f"} Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.880567 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cdda46ff6a2f9dbee5d5db85736f199ba873bfa6c242203c562f42b0c5d9e19c"} Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.881935 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dad0cc7f7c583addc5b413657bf637cb322258992e0bffd55120c8ecd0b31e91"} Oct 03 07:48:20 crc kubenswrapper[4664]: I1003 07:48:20.882714 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b2b6a315637e4d6dceaee2cd69ab8f8e3f5f2aa0adb0cafa1508a4fe38233501"} Oct 03 07:48:21 crc kubenswrapper[4664]: W1003 07:48:21.029284 4664 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Oct 03 07:48:21 crc kubenswrapper[4664]: E1003 07:48:21.029364 4664 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Oct 03 07:48:21 crc kubenswrapper[4664]: E1003 07:48:21.234287 4664 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" interval="1.6s" Oct 03 07:48:21 crc kubenswrapper[4664]: W1003 07:48:21.382674 4664 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Oct 03 07:48:21 crc kubenswrapper[4664]: E1003 07:48:21.382789 4664 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.454671 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.457308 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.457405 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.457437 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.457494 4664 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 07:48:21 crc kubenswrapper[4664]: E1003 07:48:21.458207 4664 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.190:6443: connect: connection refused" node="crc" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.831104 4664 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.888425 4664 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="19b60eef776685fde325952f0aa6c0b2a679d105f02227deae22666253ae8596" exitCode=0 Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.888484 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"19b60eef776685fde325952f0aa6c0b2a679d105f02227deae22666253ae8596"} Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.888519 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.889378 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.889412 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.889423 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.891469 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918"} Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.891514 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7"} Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.891534 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721"} Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.891547 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466"} Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.891517 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.892243 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.892276 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.892291 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.892966 4664 generic.go:334] "Generic (PLEG): container finished" 
podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928" exitCode=0 Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.893218 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928"} Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.893283 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.894049 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.894083 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.894099 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.895782 4664 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04" exitCode=0 Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.895922 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.895984 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04"} Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.896723 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.896749 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.896762 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.897290 4664 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fc449fe962a0144d2f3088d5d5b4a8769035aca8b07118aca83de2cd3183651b" exitCode=0 Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.897331 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fc449fe962a0144d2f3088d5d5b4a8769035aca8b07118aca83de2cd3183651b"} Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.897381 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.898006 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.898281 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.898309 4664 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.898321 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.899115 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.899153 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:21 crc kubenswrapper[4664]: I1003 07:48:21.899169 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.830640 4664 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Oct 03 07:48:22 crc kubenswrapper[4664]: E1003 07:48:22.835228 4664 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" interval="3.2s" Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.835313 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.911453 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738"} Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.911501 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3"} Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.911510 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47"} Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.911518 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a"} Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.913277 4664 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="01406adf04f1c29dcb0acaa2268c9514d4712a73f7f055149ae1507b1cdbb088" exitCode=0 Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.913349 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"01406adf04f1c29dcb0acaa2268c9514d4712a73f7f055149ae1507b1cdbb088"} Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.913483 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.914698 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.914735 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.914746 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.915560 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.915542 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d71e13c58bbb5cba74df86672f02d7970dae7bc41a9c88aa6652f98c62fa7122"}
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.916373 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.916400 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.916411 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.920871 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1"}
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.920927 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012"}
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.920938 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.920942 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb"}
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.921039 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.924420 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.924428 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.924452 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.924460 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.924463 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:48:22 crc kubenswrapper[4664]: I1003 07:48:22.924475 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.058389 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.069901 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.069967 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.069991 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.070031 4664 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 03 07:48:23 crc kubenswrapper[4664]: E1003 07:48:23.070745 4664 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.190:6443: connect: connection refused" node="crc"
Oct 03 07:48:23 crc kubenswrapper[4664]: W1003 07:48:23.370176 4664 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused
Oct 03 07:48:23 crc kubenswrapper[4664]: E1003 07:48:23.370274 4664 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError"
Oct 03 07:48:23 crc kubenswrapper[4664]: W1003 07:48:23.779451 4664 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused
Oct 03 07:48:23 crc kubenswrapper[4664]: E1003 07:48:23.779553 4664 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError"
Oct 03 07:48:23 crc kubenswrapper[4664]: W1003 07:48:23.827501 4664 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused
Oct 03 07:48:23 crc kubenswrapper[4664]: E1003 07:48:23.827591 4664 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.830265 4664 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.927801 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"71f17ff9f4fc1366ae9115b1054886353f6e3567b5359d46f3e053572810fa1e"}
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.927949 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.929135 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.929181 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.929197 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.931075 4664 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f6097fc39412ae589b05b16aede4059ae55e0bf6251ff504832fff0bba157aac" exitCode=0
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.931198 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f6097fc39412ae589b05b16aede4059ae55e0bf6251ff504832fff0bba157aac"}
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.931204 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.931239 4664 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.931275 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.931208 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.931275 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.932481 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.932509 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.932522 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.932536 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.932596 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.932629 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.932639 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.932535 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.932687 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.932988 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.933008 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:48:23 crc kubenswrapper[4664]: I1003 07:48:23.933018 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:48:23 crc kubenswrapper[4664]: W1003 07:48:23.935073 4664 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused
Oct 03 07:48:23 crc kubenswrapper[4664]: E1003 07:48:23.935147 4664 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError"
Oct 03 07:48:24 crc kubenswrapper[4664]: I1003 07:48:24.935195 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Oct 03 07:48:24 crc kubenswrapper[4664]: I1003 07:48:24.941086 4664 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="71f17ff9f4fc1366ae9115b1054886353f6e3567b5359d46f3e053572810fa1e" exitCode=255
Oct 03 07:48:24 crc kubenswrapper[4664]: I1003 07:48:24.941273 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 07:48:24 crc kubenswrapper[4664]: I1003 07:48:24.941410 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"71f17ff9f4fc1366ae9115b1054886353f6e3567b5359d46f3e053572810fa1e"}
Oct 03 07:48:24 crc kubenswrapper[4664]: I1003 07:48:24.942502 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:24 crc kubenswrapper[4664]: I1003 07:48:24.942547 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:48:24 crc kubenswrapper[4664]: I1003 07:48:24.942559 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:48:24 crc kubenswrapper[4664]: I1003 07:48:24.943121 4664 scope.go:117] "RemoveContainer" containerID="71f17ff9f4fc1366ae9115b1054886353f6e3567b5359d46f3e053572810fa1e"
containerID="71f17ff9f4fc1366ae9115b1054886353f6e3567b5359d46f3e053572810fa1e" Oct 03 07:48:24 crc kubenswrapper[4664]: I1003 07:48:24.945552 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"081e3f10dbfcd175f2595aabad8ff020b4878a8605ab4f85d68ecd8178bda548"} Oct 03 07:48:24 crc kubenswrapper[4664]: I1003 07:48:24.945618 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d9054c0ddac91219be608702604b3f2fc398dcb23cd4ae2c87e29aea2267383a"} Oct 03 07:48:24 crc kubenswrapper[4664]: I1003 07:48:24.945632 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5ccdd0e3981ae1631d21b940d6ab096e6c5a8f62ea0d9edfba00c925b7b4a235"} Oct 03 07:48:24 crc kubenswrapper[4664]: I1003 07:48:24.945643 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"28b42bc27e52911b15b17e3effdc251425df3177be375aed26649ed02cea13e5"} Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.375339 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.375512 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.376573 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.376617 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.376628 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.950896 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.952976 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9"} Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.953087 4664 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.953141 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.954120 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.954168 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.954181 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:25 crc 
kubenswrapper[4664]: I1003 07:48:25.957856 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"111f5ec5819c2f11bfc65c9cf5308b5d3ffc4b3bb6f8b3f65f31a9323b56d9e9"} Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.957992 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.958785 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.958822 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:25 crc kubenswrapper[4664]: I1003 07:48:25.958836 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.082112 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.082301 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.083806 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.083852 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.083866 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.091068 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.270985 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.272567 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.272633 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.272645 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.272673 4664 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.960301 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.960433 4664 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.960510 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.961246 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 
07:48:26.961347 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.961427 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.961443 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.961553 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.961589 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.961600 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.963486 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.963523 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:26 crc kubenswrapper[4664]: I1003 07:48:26.963533 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:27 crc kubenswrapper[4664]: I1003 07:48:27.076519 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:27 crc kubenswrapper[4664]: I1003 07:48:27.843406 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:27 crc kubenswrapper[4664]: I1003 07:48:27.962877 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:27 crc kubenswrapper[4664]: I1003 07:48:27.963975 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:27 crc kubenswrapper[4664]: I1003 07:48:27.964054 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:27 crc kubenswrapper[4664]: I1003 07:48:27.964074 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.058538 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.058849 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.060947 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.060983 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.060997 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.073251 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-etcd/etcd-crc" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.231700 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.587492 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.587867 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.589375 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.589436 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.589451 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.966149 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.966193 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.967889 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.967919 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.967930 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.968017 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.968061 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:28 crc kubenswrapper[4664]: I1003 07:48:28.968073 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:29 crc kubenswrapper[4664]: I1003 07:48:29.187109 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:29 crc kubenswrapper[4664]: I1003 07:48:29.187289 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:29 crc kubenswrapper[4664]: I1003 07:48:29.188266 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:29 crc kubenswrapper[4664]: I1003 07:48:29.188293 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:29 crc kubenswrapper[4664]: I1003 07:48:29.188302 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:29 crc kubenswrapper[4664]: E1003 07:48:29.951769 4664 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 07:48:29 
Oct 03 07:48:29 crc kubenswrapper[4664]: I1003 07:48:29.968327 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:29 crc kubenswrapper[4664]: I1003 07:48:29.968372 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:48:29 crc kubenswrapper[4664]: I1003 07:48:29.968383 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:48:32 crc kubenswrapper[4664]: I1003 07:48:32.187719 4664 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 03 07:48:32 crc kubenswrapper[4664]: I1003 07:48:32.187789 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 03 07:48:32 crc kubenswrapper[4664]: I1003 07:48:32.839835 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 03 07:48:32 crc kubenswrapper[4664]: I1003 07:48:32.839993 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 07:48:32 crc kubenswrapper[4664]: I1003 07:48:32.841105 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:32 crc kubenswrapper[4664]: I1003 07:48:32.841151 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:48:32 crc kubenswrapper[4664]: I1003 07:48:32.841163 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:48:34 crc kubenswrapper[4664]: I1003 07:48:34.008580 4664 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 03 07:48:34 crc kubenswrapper[4664]: I1003 07:48:34.009191 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 03 07:48:34 crc kubenswrapper[4664]: I1003 07:48:34.015865 4664 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 03 07:48:34 crc kubenswrapper[4664]: I1003 07:48:34.015973 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 03 07:48:37 crc kubenswrapper[4664]: I1003 07:48:37.077834 4664 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 03 07:48:37 crc kubenswrapper[4664]: I1003 07:48:37.077903 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.094846 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.095019 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.095953 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.095991 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.096004 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.106153 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.236010 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.236219 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.236520 4664 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.236582 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.237323 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.237364 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.237403 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.241265 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.990862 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.990862 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.991139 4664 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.991183 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 03 07:48:38 crc kubenswrapper[4664]: E1003 07:48:38.991853 4664 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.991920 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.991956 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.991979 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.992178 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.992210 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.992223 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.994893 4664 trace.go:236] Trace[1976053879]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 07:48:28.577) (total time: 10417ms): Oct 03 07:48:38 crc kubenswrapper[4664]: Trace[1976053879]: ---"Objects listed" error: 10416ms (07:48:38.993) Oct 03 07:48:38 crc kubenswrapper[4664]: Trace[1976053879]: [10.417538271s] [10.417538271s] END Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.995114 4664 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.995425 4664 reflector.go:368] Caches populated for *v1.CSIDriver from 
k8s.io/client-go/informers/factory.go:160 Oct 03 07:48:38 crc kubenswrapper[4664]: E1003 07:48:38.996557 4664 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.997532 4664 trace.go:236] Trace[1436402680]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 07:48:28.563) (total time: 10434ms): Oct 03 07:48:38 crc kubenswrapper[4664]: Trace[1436402680]: ---"Objects listed" error: 10434ms (07:48:38.997) Oct 03 07:48:38 crc kubenswrapper[4664]: Trace[1436402680]: [10.434218877s] [10.434218877s] END Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.997557 4664 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.997822 4664 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.999844 4664 trace.go:236] Trace[1782888858]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 07:48:27.632) (total time: 11367ms): Oct 03 07:48:38 crc kubenswrapper[4664]: Trace[1782888858]: ---"Objects listed" error: 11366ms (07:48:38.998) Oct 03 07:48:38 crc kubenswrapper[4664]: Trace[1782888858]: [11.367178475s] [11.367178475s] END Oct 03 07:48:38 crc kubenswrapper[4664]: I1003 07:48:38.999876 4664 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.193070 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.197587 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.497294 4664 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48090->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.497364 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48090->192.168.126.11:17697: read: connection reset by peer" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.820781 4664 apiserver.go:52] "Watching apiserver" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.823404 4664 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.823833 4664 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.825497 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.825684 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.825852 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.825904 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.827073 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.825955 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.825966 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.825924 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.827434 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.828946 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.829002 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.829032 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.830855 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.830886 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.830922 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.830854 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.831304 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.831338 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.831517 4664 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.846013 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.859931 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.870220 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.878870 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.893197 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.903229 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.905264 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.905299 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.905326 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.905349 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.905373 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.905397 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.905421 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.905552 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.905661 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906187 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906266 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906293 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906306 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906323 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906344 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906350 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906381 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906390 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906435 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906462 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906485 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906507 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906596 4664 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906638 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906658 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906679 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906700 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906721 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906741 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906764 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906787 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906811 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906892 
4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906918 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906940 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906964 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906985 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907008 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907056 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907082 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907122 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907220 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906558 4664 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907247 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906574 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906744 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906820 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906904 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.906999 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907038 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907058 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907181 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907237 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907272 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907357 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907374 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907391 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907410 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907430 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907444 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907459 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907474 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907490 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907506 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907524 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907538 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907556 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907573 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907589 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907586 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907623 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907643 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907658 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907674 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907691 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907711 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907727 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907743 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907758 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907826 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907847 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907872 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907902 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907928 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907945 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907966 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907981 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907998 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 
07:48:39.908013 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908029 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908046 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908062 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908078 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908094 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908110 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908125 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908142 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908158 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 07:48:39 crc 
kubenswrapper[4664]: I1003 07:48:39.908176 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908192 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908224 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908242 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908260 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908279 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908295 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908311 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908326 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908341 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 07:48:39 crc 
kubenswrapper[4664]: I1003 07:48:39.908357 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908372 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908391 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908447 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908464 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908479 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908497 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908567 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908584 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908621 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908638 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908656 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908670 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908688 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908707 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909773 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909832 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909873 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909903 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909938 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909967 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909999 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910027 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910059 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910127 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910154 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910184 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910216 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910250 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910322 4664 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910359 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910390 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910417 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910452 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910485 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910514 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910546 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910577 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910629 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 07:48:39 crc 
kubenswrapper[4664]: I1003 07:48:39.910659 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910696 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910727 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910763 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910794 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910826 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910853 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910885 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910922 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910954 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 07:48:39 crc kubenswrapper[4664]: 
I1003 07:48:39.910982 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911032 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911064 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911091 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911125 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911160 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911190 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911222 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911252 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911286 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911313 4664 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911349 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911385 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911434 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911467 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911500 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911526 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911555 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911584 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911638 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 
07:48:39.911675 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911713 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911740 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911773 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911804 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911832 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911858 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911893 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911923 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911950 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 
07:48:39.911980 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912010 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912037 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912067 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912095 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912128 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912153 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912185 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912217 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912247 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 
07:48:39.912281 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912314 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912351 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912378 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912410 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912440 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912464 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912496 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912526 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912553 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912584 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912638 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912674 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912703 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912734 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912796 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912838 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912875 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912907 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 
07:48:39.912935 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912969 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913002 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913036 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913081 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913109 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913142 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913180 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913212 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913238 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913433 4664 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913459 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913475 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913489 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913503 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913522 4664 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913535 4664 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913549 4664 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913563 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913580 4664 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913595 4664 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913749 4664 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913776 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913794 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913816 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913830 4664 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913844 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907813 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907816 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907827 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.914142 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.907892 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908407 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908554 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908578 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908678 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909251 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909518 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909559 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909703 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909678 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909542 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909742 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909849 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909844 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909999 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). 
InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.909999 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910021 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910176 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910190 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910278 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910321 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910632 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910632 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910719 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.910877 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.908728 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911064 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911097 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911186 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911371 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911524 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911546 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911635 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911742 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911849 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.911860 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912189 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912235 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912265 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912268 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912376 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912421 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912484 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912655 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912758 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912925 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.912981 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913013 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913034 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913233 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913247 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913265 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913591 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.914359 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.914480 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.914497 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.914524 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.914710 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.914592 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.914870 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.915013 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.915022 4664 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.915026 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.915100 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.915151 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.915264 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.915320 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.915649 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.915674 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.915694 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.915840 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.915987 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.916280 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.916377 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.916477 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.916525 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.916669 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.917068 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.917099 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.917005 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.917129 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.917262 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.917341 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.917784 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.918310 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.918321 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.918309 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.918359 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.918976 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.919021 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.919285 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.919332 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.919399 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.919436 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.919652 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.919680 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.919721 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.919862 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.920149 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.920190 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.920370 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.920693 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.920749 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.920754 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.920705 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.920981 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.921045 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.921089 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.921109 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.921286 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.921374 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.921409 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.921588 4664 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.922700 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:40.422659935 +0000 UTC m=+21.243850425 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.922752 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.922791 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.922902 4664 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.922914 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.913634 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.922309 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.922954 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:40.422940693 +0000 UTC m=+21.244131183 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.921907 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.921682 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.923142 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.923287 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.923317 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.923596 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.923859 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.924143 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.922550 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.925914 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.926664 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.927283 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.927445 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.927754 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.916031 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.928005 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.928195 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.931946 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.932458 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.932641 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.932882 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.933099 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.933287 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.933562 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.933814 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:48:40.433794764 +0000 UTC m=+21.254985254 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.933845 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.933972 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.934022 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.934140 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.934167 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.934383 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.934504 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.934851 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.934934 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.935056 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.934913 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.934740 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). 
InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.935521 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.935630 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.935991 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.935415 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.936415 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.936553 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.936565 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.936758 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). 
InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.936869 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.937090 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.937259 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.937509 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.937533 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.937598 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.937837 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.937741 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.937895 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.937985 4664 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.937630 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.937894 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.937963 4664 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.938089 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:40.438074506 +0000 UTC m=+21.259264996 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.938134 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: E1003 07:48:39.938186 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:40.438166149 +0000 UTC m=+21.259356639 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.938343 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.938849 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.942038 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.942445 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.942701 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.942727 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.943409 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.943550 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.943891 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.947891 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.949650 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.962193 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.964509 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.965858 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.966291 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.973528 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.975004 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.984863 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.994301 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.994951 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.995285 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.996813 4664 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9" exitCode=255 Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.996847 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9"} Oct 03 07:48:39 crc kubenswrapper[4664]: I1003 07:48:39.996922 4664 scope.go:117] "RemoveContainer" containerID="71f17ff9f4fc1366ae9115b1054886353f6e3567b5359d46f3e053572810fa1e" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.003377 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:40 crc kubenswrapper[4664]: E1003 07:48:40.004563 4664 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.006822 4664 scope.go:117] "RemoveContainer" containerID="387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9" Oct 03 07:48:40 crc kubenswrapper[4664]: E1003 07:48:40.006986 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.008132 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.012747 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015263 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015346 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015416 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015427 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015480 4664 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015492 4664 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015504 4664 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015516 4664 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015526 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015534 4664 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015543 4664 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015552 4664 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015561 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015571 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015592 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015633 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015648 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015644 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015660 4664 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015758 4664 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015773 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015786 4664 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015799 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015812 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015824 4664 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015858 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015872 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015884 4664 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015896 4664 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015908 4664 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015920 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015933 4664 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015945 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015957 4664 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015973 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.015990 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016003 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016016 4664 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016029 4664 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016041 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016053 4664 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016065 4664 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016077 4664 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016090 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016103 4664 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016115 4664 reconciler_common.go:293] 
"Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016127 4664 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016140 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016152 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016164 4664 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016177 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016189 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016203 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016216 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016362 4664 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016376 4664 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016389 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016401 4664 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016414 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" 
(UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016427 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016439 4664 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016452 4664 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016464 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016478 4664 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016489 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016502 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016515 4664 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016529 4664 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016545 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016570 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016582 4664 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016594 4664 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016632 4664 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016646 4664 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016658 4664 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016671 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016682 4664 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016695 4664 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016708 4664 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016721 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016734 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016767 4664 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016781 4664 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016792 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016804 4664 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016816 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016829 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016841 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016853 4664 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016864 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016876 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016887 4664 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016898 4664 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016909 4664 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016920 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016931 4664 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016943 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016956 4664 reconciler_common.go:293] "Volume detached for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016968 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016980 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.016991 4664 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017005 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017016 4664 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017029 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017040 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017051 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017063 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017076 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017088 4664 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017101 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017112 4664 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017124 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017135 4664 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017147 4664 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017159 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017170 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017181 4664 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017193 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017205 4664 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017219 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017232 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017250 4664 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017263 4664 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017274 4664 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" 
DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017289 4664 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017304 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017336 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017350 4664 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017362 4664 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017375 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017389 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017401 4664 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017415 4664 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017429 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017442 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017454 4664 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017466 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: 
I1003 07:48:40.017478 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017491 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017503 4664 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017515 4664 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017526 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017553 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017567 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017579 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.017592 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018697 4664 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018720 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018734 4664 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018747 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018760 4664 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018772 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018785 4664 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018798 4664 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018813 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018826 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018841 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018854 4664 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018867 4664 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018879 4664 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018892 4664 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018904 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018918 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 03 
07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018947 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018962 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018976 4664 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.018989 4664 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.019000 4664 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.019012 4664 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.019027 4664 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.019039 4664 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.019052 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.019065 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.019078 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.019093 4664 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.019106 4664 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 
crc kubenswrapper[4664]: I1003 07:48:40.019119 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.019133 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.019145 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.019158 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.019170 4664 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.038487 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.052770 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.064570 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71f17ff9f4fc1366ae9115b1054886353f6e3567b5359d46f3e053572810fa1e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:24Z\\\",\\\"message\\\":\\\"W1003 07:48:23.315679 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 
07:48:23.316217 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759477703 cert, and key in /tmp/serving-cert-1251427851/serving-signer.crt, /tmp/serving-cert-1251427851/serving-signer.key\\\\nI1003 07:48:23.833504 1 observer_polling.go:159] Starting file observer\\\\nW1003 07:48:23.837345 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 07:48:23.837658 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:23.839845 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1251427851/tls.crt::/tmp/serving-cert-1251427851/tls.key\\\\\\\"\\\\nF1003 07:48:23.980788 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.072979 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.081563 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.090695 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.100784 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.112373 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.124324 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.134218 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.139353 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.146648 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 07:48:40 crc kubenswrapper[4664]: W1003 07:48:40.149193 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-bf8fa7449090c1d55e39fe10204c77b99ce4e08de1c31020f121c092158e74bd WatchSource:0}: Error finding container bf8fa7449090c1d55e39fe10204c77b99ce4e08de1c31020f121c092158e74bd: Status 404 returned error can't find the container with id bf8fa7449090c1d55e39fe10204c77b99ce4e08de1c31020f121c092158e74bd Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.151533 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 07:48:40 crc kubenswrapper[4664]: W1003 07:48:40.157004 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-ca19195abbc3348dbcbc7f647e965a3165249b7431c527e0ee56100611992e35 WatchSource:0}: Error finding container ca19195abbc3348dbcbc7f647e965a3165249b7431c527e0ee56100611992e35: Status 404 returned error can't find the container with id ca19195abbc3348dbcbc7f647e965a3165249b7431c527e0ee56100611992e35 Oct 03 07:48:40 crc kubenswrapper[4664]: W1003 07:48:40.167698 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-2d61f704b5a2de7eff1a0cfa25a8c429a6a35868ea6ea4fd5f1703235095988b WatchSource:0}: Error finding container 2d61f704b5a2de7eff1a0cfa25a8c429a6a35868ea6ea4fd5f1703235095988b: Status 404 returned error can't find the container with id 2d61f704b5a2de7eff1a0cfa25a8c429a6a35868ea6ea4fd5f1703235095988b Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.523073 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.523154 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:40 crc kubenswrapper[4664]: E1003 07:48:40.523166 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:48:41.523147826 +0000 UTC m=+22.344338316 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.523192 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.523219 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:40 crc kubenswrapper[4664]: I1003 07:48:40.523242 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:40 crc kubenswrapper[4664]: E1003 07:48:40.523272 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:48:40 crc kubenswrapper[4664]: E1003 07:48:40.523290 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:48:40 crc kubenswrapper[4664]: E1003 07:48:40.523302 4664 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:40 crc kubenswrapper[4664]: E1003 07:48:40.523303 4664 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:48:40 crc kubenswrapper[4664]: E1003 07:48:40.523339 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:41.523328891 +0000 UTC m=+22.344519381 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:48:40 crc kubenswrapper[4664]: E1003 07:48:40.523354 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:41.523346631 +0000 UTC m=+22.344537121 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:40 crc kubenswrapper[4664]: E1003 07:48:40.523367 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:48:40 crc kubenswrapper[4664]: E1003 07:48:40.523396 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:48:40 crc kubenswrapper[4664]: E1003 07:48:40.523407 4664 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:40 crc kubenswrapper[4664]: E1003 07:48:40.523467 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:41.523450734 +0000 UTC m=+22.344641224 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:40 crc kubenswrapper[4664]: E1003 07:48:40.523545 4664 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:48:40 crc kubenswrapper[4664]: E1003 07:48:40.523575 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:41.523567898 +0000 UTC m=+22.344758388 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.002571 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.005144 4664 scope.go:117] "RemoveContainer" containerID="387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9" Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.005270 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.006049 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3"} Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.006084 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2d61f704b5a2de7eff1a0cfa25a8c429a6a35868ea6ea4fd5f1703235095988b"} Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.007533 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2"} Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.007625 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a"} Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.007640 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ca19195abbc3348dbcbc7f647e965a3165249b7431c527e0ee56100611992e35"} Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.008229 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bf8fa7449090c1d55e39fe10204c77b99ce4e08de1c31020f121c092158e74bd"} Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.021728 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.036319 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.050097 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.063904 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.079739 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.097263 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z"
Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.109465 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z"
Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.122790 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.137297 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.150672 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z"
Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.161937 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z"
Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.172730 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.183065 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.196625 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.210812 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.222700 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:41Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.530797 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.530920 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.530947 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:48:43.530929771 +0000 UTC m=+24.352120261 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.530970 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.531000 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.531020 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.531039 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.531055 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.531067 4664 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.531068 4664 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.531103 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:43.531093376 +0000 UTC m=+24.352283876 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.531104 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.531121 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:43.531109606 +0000 UTC m=+24.352300096 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.531074 4664 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.531128 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.531148 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:43.531143077 +0000 UTC m=+24.352333567 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.531148 4664 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.531184 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:43.531172868 +0000 UTC m=+24.352363358 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.876057 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.876143 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.876160 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.876187 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.876269 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:48:41 crc kubenswrapper[4664]: E1003 07:48:41.876369 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.881980 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.882670 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.883909 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.884528 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.885563 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.886111 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.886775 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.887744 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.888354 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.889450 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.889964 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.890997 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.891459 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.892017 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.892914 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.893415 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.894489 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.894968 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.895608 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.896571 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.897040 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.897989 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.898386 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.899154 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.899778 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.900429 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.901821 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.902321 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.903342 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.903918 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.904857 4664 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.904965 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.906740 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.907783 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.908196 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.909935 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.910697 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.911680 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.912259 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.913176 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.913634 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.914256 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.914896 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.915519 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.915983 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.916520 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.917201 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.918093 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.918662 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.919184 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.919653 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.920340 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.920959 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 03 07:48:41 crc kubenswrapper[4664]: I1003 07:48:41.921572 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.015296 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d"} Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.030358 4664 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:43Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.043285 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:43Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.056308 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:43Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.069431 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:43Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.083476 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:43Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.095875 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:43Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.108983 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-03T07:48:43Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.121772 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:43Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.548940 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.549007 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.549042 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.549090 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:48:47.549064277 +0000 UTC m=+28.370254767 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.549138 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.549153 4664 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.549181 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.549218 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:47.549199651 +0000 UTC m=+28.370390231 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.549263 4664 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.549295 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:47.549289203 +0000 UTC m=+28.370479693 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.549349 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.549421 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.549435 4664 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.549511 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:47.549488209 +0000 UTC m=+28.370678699 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.549532 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.549641 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.549662 4664 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.549765 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:47.549736766 +0000 UTC m=+28.370927526 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.876140 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.876248 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.876308 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.876430 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:48:43 crc kubenswrapper[4664]: I1003 07:48:43.876248 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:43 crc kubenswrapper[4664]: E1003 07:48:43.876538 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.721098 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hqfhs"] Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.721862 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hqfhs" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.724484 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.725912 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.726437 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.748812 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:44Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.771991 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:44Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.792964 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:44Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.813365 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:44Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.833353 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:44Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.853058 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:44Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.857322 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nhjf\" (UniqueName: \"kubernetes.io/projected/e03e57b4-cb64-42fa-b8c5-ee4863291568-kube-api-access-8nhjf\") pod \"node-resolver-hqfhs\" (UID: \"e03e57b4-cb64-42fa-b8c5-ee4863291568\") " pod="openshift-dns/node-resolver-hqfhs" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.857401 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e03e57b4-cb64-42fa-b8c5-ee4863291568-hosts-file\") pod \"node-resolver-hqfhs\" (UID: \"e03e57b4-cb64-42fa-b8c5-ee4863291568\") " pod="openshift-dns/node-resolver-hqfhs" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.867571 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:44Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.881364 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:44Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.894383 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:44Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.958549 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nhjf\" (UniqueName: \"kubernetes.io/projected/e03e57b4-cb64-42fa-b8c5-ee4863291568-kube-api-access-8nhjf\") pod \"node-resolver-hqfhs\" (UID: \"e03e57b4-cb64-42fa-b8c5-ee4863291568\") " pod="openshift-dns/node-resolver-hqfhs" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.958591 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e03e57b4-cb64-42fa-b8c5-ee4863291568-hosts-file\") pod \"node-resolver-hqfhs\" (UID: \"e03e57b4-cb64-42fa-b8c5-ee4863291568\") " pod="openshift-dns/node-resolver-hqfhs" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.958692 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e03e57b4-cb64-42fa-b8c5-ee4863291568-hosts-file\") pod \"node-resolver-hqfhs\" (UID: \"e03e57b4-cb64-42fa-b8c5-ee4863291568\") " pod="openshift-dns/node-resolver-hqfhs" Oct 03 07:48:44 crc kubenswrapper[4664]: I1003 07:48:44.983482 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nhjf\" (UniqueName: \"kubernetes.io/projected/e03e57b4-cb64-42fa-b8c5-ee4863291568-kube-api-access-8nhjf\") pod \"node-resolver-hqfhs\" (UID: \"e03e57b4-cb64-42fa-b8c5-ee4863291568\") " pod="openshift-dns/node-resolver-hqfhs" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.035867 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hqfhs" Oct 03 07:48:45 crc kubenswrapper[4664]: W1003 07:48:45.047774 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode03e57b4_cb64_42fa_b8c5_ee4863291568.slice/crio-c6eac16aeed7a01b51952cbe7ab68ed586ec1db72128e6dc0d89ee49d4e857a9 WatchSource:0}: Error finding container c6eac16aeed7a01b51952cbe7ab68ed586ec1db72128e6dc0d89ee49d4e857a9: Status 404 returned error can't find the container with id c6eac16aeed7a01b51952cbe7ab68ed586ec1db72128e6dc0d89ee49d4e857a9 Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.397680 4664 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.399672 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.399721 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.399735 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.399805 4664 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.409210 4664 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.409578 4664 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.411053 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.411122 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.411143 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.411169 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.411185 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:45Z","lastTransitionTime":"2025-10-03T07:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:45 crc kubenswrapper[4664]: E1003 07:48:45.432990 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.437014 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.437083 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.437096 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.437120 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.437134 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:45Z","lastTransitionTime":"2025-10-03T07:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:45 crc kubenswrapper[4664]: E1003 07:48:45.453375 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.457238 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.457280 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.457289 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.457306 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.457316 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:45Z","lastTransitionTime":"2025-10-03T07:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:45 crc kubenswrapper[4664]: E1003 07:48:45.472738 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.477507 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.477554 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.477564 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.477581 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.477598 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:45Z","lastTransitionTime":"2025-10-03T07:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:45 crc kubenswrapper[4664]: E1003 07:48:45.490596 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.495812 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.495867 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.495880 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.495904 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.495921 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:45Z","lastTransitionTime":"2025-10-03T07:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:45 crc kubenswrapper[4664]: E1003 07:48:45.509090 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: E1003 07:48:45.509243 4664 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.511562 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.511622 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.511640 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.511659 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.511672 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:45Z","lastTransitionTime":"2025-10-03T07:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.614557 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.614618 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.614628 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.614644 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.614654 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:45Z","lastTransitionTime":"2025-10-03T07:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.696223 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-72cm2"] Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.696730 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-x9dgm"] Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.696969 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.697102 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.699344 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.699699 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.699840 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.700003 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.700541 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.700595 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.700595 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.700754 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.702289 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2jpvm"] Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.702619 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.704128 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.705362 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-h865c"] Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.705694 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.706080 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.706157 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.706496 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.706877 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.706897 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.706922 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.709169 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.709377 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.710978 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.718880 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.719354 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.719395 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.719405 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.719430 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.719441 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:45Z","lastTransitionTime":"2025-10-03T07:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.723506 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.747575 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.773862 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.787424 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.802527 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.822430 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.822484 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.822497 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 
07:48:45.822516 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.822527 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:45Z","lastTransitionTime":"2025-10-03T07:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.859732 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866169 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-cni-bin\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866226 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovn-node-metrics-cert\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866249 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k42nn\" (UniqueName: \"kubernetes.io/projected/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-kube-api-access-k42nn\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866269 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/598b81ce-0ce7-498f-9337-ae5e6e64682b-mcd-auth-proxy-config\") pod \"machine-config-daemon-x9dgm\" (UID: \"598b81ce-0ce7-498f-9337-ae5e6e64682b\") " pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866286 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br5ch\" (UniqueName: \"kubernetes.io/projected/59e441c3-e8db-4705-9da1-0c6513d57048-kube-api-access-br5ch\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866320 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-ovn\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866418 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-node-log\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866496 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovnkube-script-lib\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866523 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-cnibin\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866581 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-run-netns\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866654 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-run-ovn-kubernetes\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866681 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-hostroot\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866723 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59e441c3-e8db-4705-9da1-0c6513d57048-os-release\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866750 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/598b81ce-0ce7-498f-9337-ae5e6e64682b-rootfs\") pod \"machine-config-daemon-x9dgm\" (UID: \"598b81ce-0ce7-498f-9337-ae5e6e64682b\") " pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866770 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-systemd\") pod \"ovnkube-node-2jpvm\" (UID: 
\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866792 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-var-lib-openvswitch\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866818 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59e441c3-e8db-4705-9da1-0c6513d57048-tuning-conf-dir\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866850 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-kubelet\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866875 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-multus-cni-dir\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866900 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9hpq\" (UniqueName: \"kubernetes.io/projected/6998d742-8d17-4f20-ab52-c30d9f7b0b89-kube-api-access-l9hpq\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.866936 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-var-lib-cni-bin\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867034 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-os-release\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867089 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6998d742-8d17-4f20-ab52-c30d9f7b0b89-cni-binary-copy\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867111 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-run-k8s-cni-cncf-io\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867132 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59e441c3-e8db-4705-9da1-0c6513d57048-cnibin\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867168 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-var-lib-kubelet\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867183 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-multus-conf-dir\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867205 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-log-socket\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867224 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6998d742-8d17-4f20-ab52-c30d9f7b0b89-multus-daemon-config\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867246 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-etc-kubernetes\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867262 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59e441c3-e8db-4705-9da1-0c6513d57048-cni-binary-copy\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867279 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-slash\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867297 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-etc-openvswitch\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867330 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx92c\" (UniqueName: \"kubernetes.io/projected/598b81ce-0ce7-498f-9337-ae5e6e64682b-kube-api-access-zx92c\") pod \"machine-config-daemon-x9dgm\" (UID: \"598b81ce-0ce7-498f-9337-ae5e6e64682b\") " pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867358 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-system-cni-dir\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867381 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-run-multus-certs\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867418 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-systemd-units\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867459 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-env-overrides\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867486 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-run-netns\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867512 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-var-lib-cni-multus\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867572 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/59e441c3-e8db-4705-9da1-0c6513d57048-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 
07:48:45.867630 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/598b81ce-0ce7-498f-9337-ae5e6e64682b-proxy-tls\") pod \"machine-config-daemon-x9dgm\" (UID: \"598b81ce-0ce7-498f-9337-ae5e6e64682b\") " pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867663 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-multus-socket-dir-parent\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867694 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-openvswitch\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867720 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-cni-netd\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867750 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867788 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovnkube-config\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.867822 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59e441c3-e8db-4705-9da1-0c6513d57048-system-cni-dir\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.875475 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.875522 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.875548 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:45 crc kubenswrapper[4664]: E1003 07:48:45.875680 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:48:45 crc kubenswrapper[4664]: E1003 07:48:45.875780 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:48:45 crc kubenswrapper[4664]: E1003 07:48:45.875874 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.892685 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.910909 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 
2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.925776 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.926057 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.926139 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.926247 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.926346 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:45Z","lastTransitionTime":"2025-10-03T07:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.941210 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 
crc kubenswrapper[4664]: I1003 07:48:45.955838 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.967415 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.968821 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/598b81ce-0ce7-498f-9337-ae5e6e64682b-rootfs\") pod \"machine-config-daemon-x9dgm\" (UID: \"598b81ce-0ce7-498f-9337-ae5e6e64682b\") " pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.968976 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-systemd\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.968990 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/598b81ce-0ce7-498f-9337-ae5e6e64682b-rootfs\") pod \"machine-config-daemon-x9dgm\" (UID: \"598b81ce-0ce7-498f-9337-ae5e6e64682b\") " 
pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969295 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-systemd\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969358 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-var-lib-openvswitch\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969509 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59e441c3-e8db-4705-9da1-0c6513d57048-tuning-conf-dir\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969550 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-multus-cni-dir\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969580 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9hpq\" (UniqueName: \"kubernetes.io/projected/6998d742-8d17-4f20-ab52-c30d9f7b0b89-kube-api-access-l9hpq\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969636 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-kubelet\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969690 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-var-lib-cni-bin\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969736 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-os-release\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969772 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6998d742-8d17-4f20-ab52-c30d9f7b0b89-cni-binary-copy\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969802 4664 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-run-k8s-cni-cncf-io\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969828 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59e441c3-e8db-4705-9da1-0c6513d57048-cnibin\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969867 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-var-lib-kubelet\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969855 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-var-lib-cni-bin\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969929 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-multus-conf-dir\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969883 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-kubelet\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969980 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-var-lib-kubelet\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969959 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-os-release\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969893 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-multus-conf-dir\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970039 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59e441c3-e8db-4705-9da1-0c6513d57048-cnibin\") pod \"multus-additional-cni-plugins-h865c\" (UID: 
\"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970083 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-log-socket\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.969983 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-run-k8s-cni-cncf-io\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970125 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6998d742-8d17-4f20-ab52-c30d9f7b0b89-multus-daemon-config\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970132 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-log-socket\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970159 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-etc-kubernetes\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970192 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59e441c3-e8db-4705-9da1-0c6513d57048-cni-binary-copy\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970221 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-slash\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970247 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-etc-openvswitch\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970260 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-etc-kubernetes\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 
07:48:45.970275 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx92c\" (UniqueName: \"kubernetes.io/projected/598b81ce-0ce7-498f-9337-ae5e6e64682b-kube-api-access-zx92c\") pod \"machine-config-daemon-x9dgm\" (UID: \"598b81ce-0ce7-498f-9337-ae5e6e64682b\") " pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970305 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-slash\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970307 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-system-cni-dir\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970356 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59e441c3-e8db-4705-9da1-0c6513d57048-tuning-conf-dir\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970367 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-run-multus-certs\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970405 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-systemd-units\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970428 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-env-overrides\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970444 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-system-cni-dir\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970452 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-run-netns\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970486 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-run-netns\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970493 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-var-lib-cni-multus\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970520 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-run-multus-certs\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970526 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/59e441c3-e8db-4705-9da1-0c6513d57048-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970563 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-systemd-units\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970567 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/598b81ce-0ce7-498f-9337-ae5e6e64682b-proxy-tls\") pod \"machine-config-daemon-x9dgm\" (UID: \"598b81ce-0ce7-498f-9337-ae5e6e64682b\") " pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970628 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-multus-socket-dir-parent\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970659 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-openvswitch\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970682 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6998d742-8d17-4f20-ab52-c30d9f7b0b89-cni-binary-copy\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970689 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-cni-netd\") pod \"ovnkube-node-2jpvm\" (UID: 
\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970714 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-cni-netd\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970749 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970774 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovnkube-config\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970792 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59e441c3-e8db-4705-9da1-0c6513d57048-system-cni-dir\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970812 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovn-node-metrics-cert\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970830 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k42nn\" (UniqueName: \"kubernetes.io/projected/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-kube-api-access-k42nn\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970845 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-cni-bin\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970866 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/598b81ce-0ce7-498f-9337-ae5e6e64682b-mcd-auth-proxy-config\") pod \"machine-config-daemon-x9dgm\" (UID: \"598b81ce-0ce7-498f-9337-ae5e6e64682b\") " pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970885 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br5ch\" (UniqueName: 
\"kubernetes.io/projected/59e441c3-e8db-4705-9da1-0c6513d57048-kube-api-access-br5ch\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970920 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-ovn\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970939 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-node-log\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970959 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovnkube-script-lib\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.970981 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-cnibin\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971023 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-run-netns\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971044 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-run-ovn-kubernetes\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971067 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-hostroot\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971085 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59e441c3-e8db-4705-9da1-0c6513d57048-os-release\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971167 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59e441c3-e8db-4705-9da1-0c6513d57048-os-release\") pod \"multus-additional-cni-plugins-h865c\" (UID: 
\"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971173 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6998d742-8d17-4f20-ab52-c30d9f7b0b89-multus-daemon-config\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971194 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971215 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-openvswitch\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971229 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59e441c3-e8db-4705-9da1-0c6513d57048-system-cni-dir\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971232 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-multus-socket-dir-parent\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971179 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59e441c3-e8db-4705-9da1-0c6513d57048-cni-binary-copy\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971305 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-etc-openvswitch\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971337 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-run-netns\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971353 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-node-log\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971404 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-env-overrides\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971414 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-run-ovn-kubernetes\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971473 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-cnibin\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971501 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-cni-bin\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971527 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-ovn\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971522 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-host-var-lib-cni-multus\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971551 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-hostroot\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971679 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6998d742-8d17-4f20-ab52-c30d9f7b0b89-multus-cni-dir\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.971998 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovnkube-config\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.972098 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/598b81ce-0ce7-498f-9337-ae5e6e64682b-mcd-auth-proxy-config\") pod \"machine-config-daemon-x9dgm\" (UID: \"598b81ce-0ce7-498f-9337-ae5e6e64682b\") " pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.972129 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/59e441c3-e8db-4705-9da1-0c6513d57048-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.972171 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovnkube-script-lib\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.972369 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-var-lib-openvswitch\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.975429 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/598b81ce-0ce7-498f-9337-ae5e6e64682b-proxy-tls\") pod \"machine-config-daemon-x9dgm\" (UID: \"598b81ce-0ce7-498f-9337-ae5e6e64682b\") " pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.982751 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovn-node-metrics-cert\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.987169 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9hpq\" (UniqueName: \"kubernetes.io/projected/6998d742-8d17-4f20-ab52-c30d9f7b0b89-kube-api-access-l9hpq\") pod \"multus-72cm2\" (UID: \"6998d742-8d17-4f20-ab52-c30d9f7b0b89\") " pod="openshift-multus/multus-72cm2" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.988144 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k42nn\" (UniqueName: \"kubernetes.io/projected/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-kube-api-access-k42nn\") pod \"ovnkube-node-2jpvm\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.994724 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:45Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:45 crc kubenswrapper[4664]: I1003 07:48:45.995503 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br5ch\" (UniqueName: \"kubernetes.io/projected/59e441c3-e8db-4705-9da1-0c6513d57048-kube-api-access-br5ch\") pod \"multus-additional-cni-plugins-h865c\" (UID: \"59e441c3-e8db-4705-9da1-0c6513d57048\") " pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.001311 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx92c\" (UniqueName: \"kubernetes.io/projected/598b81ce-0ce7-498f-9337-ae5e6e64682b-kube-api-access-zx92c\") pod \"machine-config-daemon-x9dgm\" (UID: \"598b81ce-0ce7-498f-9337-ae5e6e64682b\") " pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.011338 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.011826 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-72cm2" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.021790 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.024208 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hqfhs" event={"ID":"e03e57b4-cb64-42fa-b8c5-ee4863291568","Type":"ContainerStarted","Data":"31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092"} Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.024356 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hqfhs" event={"ID":"e03e57b4-cb64-42fa-b8c5-ee4863291568","Type":"ContainerStarted","Data":"c6eac16aeed7a01b51952cbe7ab68ed586ec1db72128e6dc0d89ee49d4e857a9"} Oct 03 07:48:46 crc kubenswrapper[4664]: W1003 07:48:46.025006 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6998d742_8d17_4f20_ab52_c30d9f7b0b89.slice/crio-1c86d8119635f23111af017b52aa34187583d64157f2b290461b5b1948147f49 WatchSource:0}: Error finding container 1c86d8119635f23111af017b52aa34187583d64157f2b290461b5b1948147f49: Status 404 returned error can't find the container with id 1c86d8119635f23111af017b52aa34187583d64157f2b290461b5b1948147f49 Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.028697 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.028743 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.028753 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.028771 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.028783 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:46Z","lastTransitionTime":"2025-10-03T07:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.029038 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.029905 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:46 crc kubenswrapper[4664]: W1003 07:48:46.035544 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod598b81ce_0ce7_498f_9337_ae5e6e64682b.slice/crio-cf48f3c40afa216858263d24906803f732ebcba7939b2602bf7b41901717940b WatchSource:0}: Error finding container cf48f3c40afa216858263d24906803f732ebcba7939b2602bf7b41901717940b: Status 404 returned error can't find the container with id cf48f3c40afa216858263d24906803f732ebcba7939b2602bf7b41901717940b Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.039085 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-h865c" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.045980 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: W1003 07:48:46.049644 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bb65800_b794_4cb7_8fdd_ebbf3a8ff78d.slice/crio-1f80a89734a09e0e7687baa3ab9f827adc9a0a3af6e78599cde9d3d8954f5cac WatchSource:0}: Error finding container 1f80a89734a09e0e7687baa3ab9f827adc9a0a3af6e78599cde9d3d8954f5cac: Status 404 returned error can't find the container with id 1f80a89734a09e0e7687baa3ab9f827adc9a0a3af6e78599cde9d3d8954f5cac Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.073073 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: W1003 07:48:46.087613 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59e441c3_e8db_4705_9da1_0c6513d57048.slice/crio-d4e24afd3917625fbe7d5af8065d5060dc5096274874efa66764fd897588b62c WatchSource:0}: Error finding container d4e24afd3917625fbe7d5af8065d5060dc5096274874efa66764fd897588b62c: Status 404 returned error can't find the container with id d4e24afd3917625fbe7d5af8065d5060dc5096274874efa66764fd897588b62c Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.089853 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.107400 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.119389 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.130645 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.130694 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.130706 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.130726 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.130746 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:46Z","lastTransitionTime":"2025-10-03T07:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.139388 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.157670 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.175830 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.193416 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.207486 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.224670 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.233145 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.233190 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.233203 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.233226 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.233242 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:46Z","lastTransitionTime":"2025-10-03T07:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.241817 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.263450 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.277537 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.295656 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.311570 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.326596 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.336193 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.336241 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.336257 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.336276 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.336289 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:46Z","lastTransitionTime":"2025-10-03T07:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.341023 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.352497 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.369495 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.382060 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.397083 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.411368 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.440057 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.440132 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.440143 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.440168 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.440182 4664 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:46Z","lastTransitionTime":"2025-10-03T07:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.542965 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.543011 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.543021 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.543037 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.543048 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:46Z","lastTransitionTime":"2025-10-03T07:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.645335 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.645380 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.645402 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.645417 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.645430 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:46Z","lastTransitionTime":"2025-10-03T07:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.748400 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.748461 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.748473 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.748492 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.748506 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:46Z","lastTransitionTime":"2025-10-03T07:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.852342 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.852400 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.852411 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.852443 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.852456 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:46Z","lastTransitionTime":"2025-10-03T07:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.955684 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.955721 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.955729 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.955741 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:46 crc kubenswrapper[4664]: I1003 07:48:46.955753 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:46Z","lastTransitionTime":"2025-10-03T07:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.028460 4664 generic.go:334] "Generic (PLEG): container finished" podID="59e441c3-e8db-4705-9da1-0c6513d57048" containerID="58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e" exitCode=0 Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.028690 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" event={"ID":"59e441c3-e8db-4705-9da1-0c6513d57048","Type":"ContainerDied","Data":"58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.028887 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" event={"ID":"59e441c3-e8db-4705-9da1-0c6513d57048","Type":"ContainerStarted","Data":"d4e24afd3917625fbe7d5af8065d5060dc5096274874efa66764fd897588b62c"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.031842 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-72cm2" event={"ID":"6998d742-8d17-4f20-ab52-c30d9f7b0b89","Type":"ContainerStarted","Data":"a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.031885 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-72cm2" event={"ID":"6998d742-8d17-4f20-ab52-c30d9f7b0b89","Type":"ContainerStarted","Data":"1c86d8119635f23111af017b52aa34187583d64157f2b290461b5b1948147f49"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.034852 4664 generic.go:334] "Generic (PLEG): container finished" podID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerID="cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12" exitCode=0 Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.035386 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerDied","Data":"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.035467 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerStarted","Data":"1f80a89734a09e0e7687baa3ab9f827adc9a0a3af6e78599cde9d3d8954f5cac"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.039442 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.039535 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.039554 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"cf48f3c40afa216858263d24906803f732ebcba7939b2602bf7b41901717940b"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.056524 4664 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceac
count\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.059001 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.059050 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.059063 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.059081 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.059096 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:47Z","lastTransitionTime":"2025-10-03T07:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.073401 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.093922 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.108260 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.126193 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.144031 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.159089 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.161449 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.161485 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.161499 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.161522 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.161537 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:47Z","lastTransitionTime":"2025-10-03T07:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.172984 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.183645 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.197477 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.209892 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.224125 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.237807 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.253522 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.264018 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.264055 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.264066 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.264084 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.264097 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:47Z","lastTransitionTime":"2025-10-03T07:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.270893 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.290250 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.304376 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc 
kubenswrapper[4664]: I1003 07:48:47.317408 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.330507 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.342650 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.356884 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.370764 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.370811 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.370821 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.370835 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.370844 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:47Z","lastTransitionTime":"2025-10-03T07:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.378136 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.392781 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.403630 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.415110 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.425787 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.473764 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.473797 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.473805 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.473817 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.473827 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:47Z","lastTransitionTime":"2025-10-03T07:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.576920 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.576959 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.576972 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.577254 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.577282 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:47Z","lastTransitionTime":"2025-10-03T07:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.588084 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.588178 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.588205 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.588236 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.588254 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.588355 4664 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.588377 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.588404 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:55.588391656 +0000 UTC m=+36.409582146 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.588409 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.588427 4664 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.588481 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:55.588463368 +0000 UTC m=+36.409653918 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.588544 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.588556 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.588564 4664 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.588589 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:55.588583151 +0000 UTC m=+36.409773641 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.588635 4664 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.588657 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:48:55.588649713 +0000 UTC m=+36.409840203 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.588773 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:48:55.588761406 +0000 UTC m=+36.409951966 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.679989 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.680040 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.680051 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.680063 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.680073 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:47Z","lastTransitionTime":"2025-10-03T07:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.782760 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.783113 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.783124 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.783143 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.783156 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:47Z","lastTransitionTime":"2025-10-03T07:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.875845 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.875891 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.875942 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.876024 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.876124 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:48:47 crc kubenswrapper[4664]: E1003 07:48:47.876211 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.885585 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.885649 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.885659 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.885674 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.885686 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:47Z","lastTransitionTime":"2025-10-03T07:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.953423 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-9z9q9"] Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.953873 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9z9q9" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.956558 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.956756 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.957041 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.957645 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.974535 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.989222 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.989265 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.989278 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.989299 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.989311 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:47Z","lastTransitionTime":"2025-10-03T07:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.997122 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log
-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.997571 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqzrd\" (UniqueName: \"kubernetes.io/projected/b4492bc6-9e61-4748-935e-e070a703c05e-kube-api-access-wqzrd\") pod \"node-ca-9z9q9\" (UID: \"b4492bc6-9e61-4748-935e-e070a703c05e\") " pod="openshift-image-registry/node-ca-9z9q9" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.997727 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b4492bc6-9e61-4748-935e-e070a703c05e-serviceca\") pod \"node-ca-9z9q9\" (UID: \"b4492bc6-9e61-4748-935e-e070a703c05e\") " pod="openshift-image-registry/node-ca-9z9q9" Oct 03 07:48:47 crc kubenswrapper[4664]: I1003 07:48:47.997787 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4492bc6-9e61-4748-935e-e070a703c05e-host\") pod \"node-ca-9z9q9\" (UID: \"b4492bc6-9e61-4748-935e-e070a703c05e\") " pod="openshift-image-registry/node-ca-9z9q9" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.014634 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.031280 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 
2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.045280 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerStarted","Data":"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"} Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.045346 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerStarted","Data":"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"} Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.045361 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerStarted","Data":"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"} Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.045374 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerStarted","Data":"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"} Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.045387 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerStarted","Data":"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"} Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.047492 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.049074 4664 generic.go:334] "Generic (PLEG): container finished" podID="59e441c3-e8db-4705-9da1-0c6513d57048" containerID="4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11" exitCode=0 Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.049105 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" event={"ID":"59e441c3-e8db-4705-9da1-0c6513d57048","Type":"ContainerDied","Data":"4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11"} Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.059883 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.076709 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a57
8bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.090142 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.092225 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.092269 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.092281 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.092300 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.092311 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:48Z","lastTransitionTime":"2025-10-03T07:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.098632 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqzrd\" (UniqueName: \"kubernetes.io/projected/b4492bc6-9e61-4748-935e-e070a703c05e-kube-api-access-wqzrd\") pod \"node-ca-9z9q9\" (UID: \"b4492bc6-9e61-4748-935e-e070a703c05e\") " pod="openshift-image-registry/node-ca-9z9q9" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.098695 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b4492bc6-9e61-4748-935e-e070a703c05e-serviceca\") pod \"node-ca-9z9q9\" (UID: \"b4492bc6-9e61-4748-935e-e070a703c05e\") " pod="openshift-image-registry/node-ca-9z9q9" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.098736 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4492bc6-9e61-4748-935e-e070a703c05e-host\") pod \"node-ca-9z9q9\" (UID: \"b4492bc6-9e61-4748-935e-e070a703c05e\") " pod="openshift-image-registry/node-ca-9z9q9" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.098800 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4492bc6-9e61-4748-935e-e070a703c05e-host\") pod \"node-ca-9z9q9\" (UID: \"b4492bc6-9e61-4748-935e-e070a703c05e\") " pod="openshift-image-registry/node-ca-9z9q9" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.100177 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b4492bc6-9e61-4748-935e-e070a703c05e-serviceca\") pod \"node-ca-9z9q9\" (UID: \"b4492bc6-9e61-4748-935e-e070a703c05e\") " pod="openshift-image-registry/node-ca-9z9q9" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.104590 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.119060 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqzrd\" (UniqueName: \"kubernetes.io/projected/b4492bc6-9e61-4748-935e-e070a703c05e-kube-api-access-wqzrd\") pod \"node-ca-9z9q9\" (UID: \"b4492bc6-9e61-4748-935e-e070a703c05e\") " pod="openshift-image-registry/node-ca-9z9q9" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.122870 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.138559 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.153531 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.167416 4664 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.184342 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.195344 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.195389 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.195399 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.195414 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.195423 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:48Z","lastTransitionTime":"2025-10-03T07:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.200806 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.220449 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.232681 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.246745 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.261856 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.272697 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9z9q9" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.276059 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 
07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.290169 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.298375 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.298442 4664 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.298456 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.298480 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.298493 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:48Z","lastTransitionTime":"2025-10-03T07:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.304534 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.318583 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.331187 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.346505 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.367576 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.389596 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.402413 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.402477 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.402493 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.402513 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.402569 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:48Z","lastTransitionTime":"2025-10-03T07:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.407214 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\
\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.504945 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.504984 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.504995 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.505009 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.505021 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:48Z","lastTransitionTime":"2025-10-03T07:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.608306 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.608350 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.608362 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.608377 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.608386 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:48Z","lastTransitionTime":"2025-10-03T07:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.711497 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.711559 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.711641 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.711680 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.711694 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:48Z","lastTransitionTime":"2025-10-03T07:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.814344 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.815237 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.815269 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.815298 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.815314 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:48Z","lastTransitionTime":"2025-10-03T07:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.917319 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.917390 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.917403 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.917422 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:48 crc kubenswrapper[4664]: I1003 07:48:48.917431 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:48Z","lastTransitionTime":"2025-10-03T07:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.020202 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.020249 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.020264 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.020287 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.020304 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:49Z","lastTransitionTime":"2025-10-03T07:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.056238 4664 generic.go:334] "Generic (PLEG): container finished" podID="59e441c3-e8db-4705-9da1-0c6513d57048" containerID="a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7" exitCode=0 Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.056358 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" event={"ID":"59e441c3-e8db-4705-9da1-0c6513d57048","Type":"ContainerDied","Data":"a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7"} Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.060758 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerStarted","Data":"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"} Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.062750 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9z9q9" event={"ID":"b4492bc6-9e61-4748-935e-e070a703c05e","Type":"ContainerStarted","Data":"aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf"} Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.062786 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9z9q9" event={"ID":"b4492bc6-9e61-4748-935e-e070a703c05e","Type":"ContainerStarted","Data":"95755f59c4683912fadd1dfce670f7d29fc517254215e9f328dd3ca1500a563c"} Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.076020 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syst
em-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.092406 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.112678 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.122784 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.122828 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.122840 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.122858 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.122869 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:49Z","lastTransitionTime":"2025-10-03T07:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.135040 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log
-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.151634 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.165026 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.181860 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.197858 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.214545 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.226041 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.226078 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.226089 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.226107 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.226118 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:49Z","lastTransitionTime":"2025-10-03T07:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.229698 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.246780 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.259228 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974
810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.272762 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.297861 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.312466 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.329034 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.329085 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.329099 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.329122 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.329135 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:49Z","lastTransitionTime":"2025-10-03T07:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.339184 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log
-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.356816 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.371746 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 
07:48:49.387967 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.403121 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.415971 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.431782 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.432701 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.432736 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.432750 4664 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.432774 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.432786 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:49Z","lastTransitionTime":"2025-10-03T07:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.448163 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.461741 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.464554 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.465422 4664 scope.go:117] "RemoveContainer" containerID="387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9" Oct 03 07:48:49 crc kubenswrapper[4664]: E1003 07:48:49.465776 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.474198 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.491037 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.500696 4664 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.514690 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.536077 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.536124 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.536136 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.536153 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.536165 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:49Z","lastTransitionTime":"2025-10-03T07:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.639711 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.639759 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.639779 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.639806 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.639822 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:49Z","lastTransitionTime":"2025-10-03T07:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.743695 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.743970 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.744045 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.744122 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.744189 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:49Z","lastTransitionTime":"2025-10-03T07:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.847930 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.848013 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.848036 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.848068 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.848092 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:49Z","lastTransitionTime":"2025-10-03T07:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.875332 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.875332 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.875499 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:49 crc kubenswrapper[4664]: E1003 07:48:49.875707 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:48:49 crc kubenswrapper[4664]: E1003 07:48:49.875853 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:48:49 crc kubenswrapper[4664]: E1003 07:48:49.875958 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.896424 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.910417 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.923105 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.937890 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.951061 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.951120 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.951149 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.951188 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.951216 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:49Z","lastTransitionTime":"2025-10-03T07:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.957108 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.979967 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:49 crc kubenswrapper[4664]: I1003 07:48:49.998455 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.014651 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.028446 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.045150 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.054011 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.054071 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.054085 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.054107 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.054120 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:50Z","lastTransitionTime":"2025-10-03T07:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.061060 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.069580 4664 generic.go:334] "Generic (PLEG): container finished" podID="59e441c3-e8db-4705-9da1-0c6513d57048" containerID="6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9" exitCode=0 Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.069667 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" event={"ID":"59e441c3-e8db-4705-9da1-0c6513d57048","Type":"ContainerDied","Data":"6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9"} Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.079901 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.099170 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.114713 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.128440 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.146434 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.157943 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.158019 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.158030 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.158050 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 
03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.158064 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:50Z","lastTransitionTime":"2025-10-03T07:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.168923 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z 
is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.184227 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.198494 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.210531 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.227562 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.244337 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.259845 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.260840 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.260863 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.260872 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.260888 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.260897 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:50Z","lastTransitionTime":"2025-10-03T07:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.275430 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.286294 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.299387 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.309687 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.318812 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.362820 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.362870 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.362883 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.362900 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.362911 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:50Z","lastTransitionTime":"2025-10-03T07:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.464876 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.464916 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.464925 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.464945 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.464965 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:50Z","lastTransitionTime":"2025-10-03T07:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.567434 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.567480 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.567490 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.567505 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.567514 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:50Z","lastTransitionTime":"2025-10-03T07:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.669651 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.669689 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.669701 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.669716 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.669728 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:50Z","lastTransitionTime":"2025-10-03T07:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.772036 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.772075 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.772089 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.772106 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.772118 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:50Z","lastTransitionTime":"2025-10-03T07:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.874144 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.874188 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.874199 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.874217 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.874232 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:50Z","lastTransitionTime":"2025-10-03T07:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.976315 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.976349 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.976358 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.976371 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:50 crc kubenswrapper[4664]: I1003 07:48:50.976380 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:50Z","lastTransitionTime":"2025-10-03T07:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.074967 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerStarted","Data":"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"} Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.077095 4664 generic.go:334] "Generic (PLEG): container finished" podID="59e441c3-e8db-4705-9da1-0c6513d57048" containerID="74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c" exitCode=0 Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.077122 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" event={"ID":"59e441c3-e8db-4705-9da1-0c6513d57048","Type":"ContainerDied","Data":"74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c"} Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.078172 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.078225 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.078243 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.078260 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.078272 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:51Z","lastTransitionTime":"2025-10-03T07:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.100623 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:51Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.115197 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d
86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:51Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.132735 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:51Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.147463 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:51Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.162497 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:51Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.173684 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T07:48:51Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.186932 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:51Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.187324 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.187366 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.187378 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.187395 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.187405 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:51Z","lastTransitionTime":"2025-10-03T07:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.199663 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:51Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.211391 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:51Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.227492 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:51Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.240586 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:51Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.253305 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:51Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.264740 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-da
emon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:51Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.274353 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:51Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.289860 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.290127 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.290233 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.290341 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.290430 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:51Z","lastTransitionTime":"2025-10-03T07:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.392888 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.392937 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.392949 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.392968 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.392978 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:51Z","lastTransitionTime":"2025-10-03T07:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.496061 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.496104 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.496116 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.496131 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.496145 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:51Z","lastTransitionTime":"2025-10-03T07:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.599189 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.599265 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.599279 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.599306 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.599321 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:51Z","lastTransitionTime":"2025-10-03T07:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.702839 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.702904 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.702921 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.702943 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.702958 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:51Z","lastTransitionTime":"2025-10-03T07:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.806690 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.806747 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.806763 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.806786 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.806806 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:51Z","lastTransitionTime":"2025-10-03T07:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.875812 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.875855 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.875946 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:51 crc kubenswrapper[4664]: E1003 07:48:51.876025 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:48:51 crc kubenswrapper[4664]: E1003 07:48:51.876169 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:48:51 crc kubenswrapper[4664]: E1003 07:48:51.876267 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.908990 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.909032 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.909043 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.909059 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:51 crc kubenswrapper[4664]: I1003 07:48:51.909069 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:51Z","lastTransitionTime":"2025-10-03T07:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.012982 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.013074 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.013096 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.013130 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.013151 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:52Z","lastTransitionTime":"2025-10-03T07:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.085973 4664 generic.go:334] "Generic (PLEG): container finished" podID="59e441c3-e8db-4705-9da1-0c6513d57048" containerID="e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9" exitCode=0 Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.086033 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" event={"ID":"59e441c3-e8db-4705-9da1-0c6513d57048","Type":"ContainerDied","Data":"e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9"} Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.109388 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:52Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.116042 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.116159 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.116176 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.116193 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.116204 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:52Z","lastTransitionTime":"2025-10-03T07:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.125593 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:52Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.142445 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:52Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.160146 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:52Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.175419 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:52Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.195277 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:52Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.213569 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:52Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.225850 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.225918 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.225930 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.225953 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.225968 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:52Z","lastTransitionTime":"2025-10-03T07:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.229471 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:52Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.245143 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:52Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.260569 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:52Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.273531 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T07:48:52Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.286455 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:52Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.299729 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:52Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.312872 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:52Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.328781 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:52 
crc kubenswrapper[4664]: I1003 07:48:52.328831 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.328842 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.328861 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.328873 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:52Z","lastTransitionTime":"2025-10-03T07:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.431979 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.432519 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.432536 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.432555 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.432569 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:52Z","lastTransitionTime":"2025-10-03T07:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.535294 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.535368 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.535384 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.535408 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.535423 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:52Z","lastTransitionTime":"2025-10-03T07:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.639793 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.639835 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.639878 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.639893 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.639902 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:52Z","lastTransitionTime":"2025-10-03T07:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.743334 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.743379 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.743390 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.743411 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.743422 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:52Z","lastTransitionTime":"2025-10-03T07:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.846373 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.846423 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.846437 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.846460 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.846474 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:52Z","lastTransitionTime":"2025-10-03T07:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.949700 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.949767 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.949787 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.949815 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:52 crc kubenswrapper[4664]: I1003 07:48:52.949836 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:52Z","lastTransitionTime":"2025-10-03T07:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.052882 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.052954 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.052973 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.052999 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.053018 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:53Z","lastTransitionTime":"2025-10-03T07:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.095320 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" event={"ID":"59e441c3-e8db-4705-9da1-0c6513d57048","Type":"ContainerStarted","Data":"45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6"} Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.102662 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerStarted","Data":"b9a2122c70537d35a5c0829063c331dcaecb4f1835ebc5bf577c2441b5ccae3d"} Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.103006 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.103072 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.152851 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.156137 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.156197 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.156234 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.156268 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.156289 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:53Z","lastTransitionTime":"2025-10-03T07:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.196145 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.204202 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.210291 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z 
is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.227042 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountP
ath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.239351 4664 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.256379 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.259040 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.259091 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.259104 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.259124 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.259137 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:53Z","lastTransitionTime":"2025-10-03T07:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.272755 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.287363 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.304818 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.318554 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.332128 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.354379 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.361517 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.361555 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.361571 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.361592 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.361629 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:53Z","lastTransitionTime":"2025-10-03T07:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.367860 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.379576 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.393455 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.408267 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.420088 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.434942 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.448776 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.463859 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.463902 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.463912 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.463927 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.463938 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:53Z","lastTransitionTime":"2025-10-03T07:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.465954 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:
48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.483291 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.504329 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a2122c70537d35a5c0829063c331dcaecb4f1835ebc5bf577c2441b5ccae3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.520486 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.538785 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.552933 4664 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.566779 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.566850 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:53 crc kubenswrapper[4664]: 
I1003 07:48:53.566864 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.566887 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.566907 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:53Z","lastTransitionTime":"2025-10-03T07:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.569557 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.587459 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.599404 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.615727 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:53Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.669240 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.669299 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.669310 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.669335 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.669347 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:53Z","lastTransitionTime":"2025-10-03T07:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.773124 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.773189 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.773203 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.773228 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.773246 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:53Z","lastTransitionTime":"2025-10-03T07:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.875351 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.875464 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.875380 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:53 crc kubenswrapper[4664]: E1003 07:48:53.875600 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:48:53 crc kubenswrapper[4664]: E1003 07:48:53.875945 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:48:53 crc kubenswrapper[4664]: E1003 07:48:53.876145 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.878448 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.878498 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.878567 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.878594 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.878715 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:53Z","lastTransitionTime":"2025-10-03T07:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.982780 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.982857 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.982876 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.982920 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:53 crc kubenswrapper[4664]: I1003 07:48:53.982960 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:53Z","lastTransitionTime":"2025-10-03T07:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.087021 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.087066 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.087077 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.087096 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.087110 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:54Z","lastTransitionTime":"2025-10-03T07:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.107950 4664 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.190082 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.190155 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.190171 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.190200 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.190221 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:54Z","lastTransitionTime":"2025-10-03T07:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.293164 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.293220 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.293229 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.293250 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.293260 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:54Z","lastTransitionTime":"2025-10-03T07:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.396184 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.396229 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.396238 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.396255 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.396267 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:54Z","lastTransitionTime":"2025-10-03T07:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.499038 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.499095 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.499116 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.499134 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.499144 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:54Z","lastTransitionTime":"2025-10-03T07:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.601316 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.601592 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.601689 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.601756 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.601853 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:54Z","lastTransitionTime":"2025-10-03T07:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.704511 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.704592 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.704626 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.704654 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.704670 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:54Z","lastTransitionTime":"2025-10-03T07:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.808067 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.808108 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.808118 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.808136 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.808147 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:54Z","lastTransitionTime":"2025-10-03T07:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.911182 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.911231 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.911250 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.911270 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:54 crc kubenswrapper[4664]: I1003 07:48:54.911282 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:54Z","lastTransitionTime":"2025-10-03T07:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.013462 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.013505 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.013516 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.013694 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.013724 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:55Z","lastTransitionTime":"2025-10-03T07:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.111148 4664 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.116419 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.116483 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.116494 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.116513 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.116526 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:55Z","lastTransitionTime":"2025-10-03T07:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.219375 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.219439 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.219453 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.219476 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.219490 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:55Z","lastTransitionTime":"2025-10-03T07:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.322241 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.322312 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.322324 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.322345 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.322360 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:55Z","lastTransitionTime":"2025-10-03T07:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.424556 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.424772 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.424867 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.424960 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.425038 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:55Z","lastTransitionTime":"2025-10-03T07:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.527711 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.527773 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.527782 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.527794 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.527803 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:55Z","lastTransitionTime":"2025-10-03T07:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.594036 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.594127 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.594173 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:49:11.594144577 +0000 UTC m=+52.415335077 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.594202 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.594211 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.594218 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.594231 4664 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.594249 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.594271 4664 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 07:49:11.59425865 +0000 UTC m=+52.415449140 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.594289 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.594338 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.594355 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.594355 4664 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.594367 4664 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.594380 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:49:11.594374014 +0000 UTC m=+52.415564504 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.594400 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 07:49:11.594388574 +0000 UTC m=+52.415579064 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.594400 4664 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.594486 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:49:11.594465826 +0000 UTC m=+52.415656366 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.631034 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.631082 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.631091 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.631106 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.631117 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:55Z","lastTransitionTime":"2025-10-03T07:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.689904 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.690001 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.690020 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.690042 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.690054 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:55Z","lastTransitionTime":"2025-10-03T07:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.706712 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:55Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.712387 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.712418 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.712427 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.712442 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.712451 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:55Z","lastTransitionTime":"2025-10-03T07:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.729161 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:55Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.735201 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.735276 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.735293 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.735318 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.735332 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:55Z","lastTransitionTime":"2025-10-03T07:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.749315 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:55Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.753899 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.753960 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.753974 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.754017 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.754032 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:55Z","lastTransitionTime":"2025-10-03T07:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.768284 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:55Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.773100 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.773154 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.773166 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.773187 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.773203 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:55Z","lastTransitionTime":"2025-10-03T07:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.789282 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:55Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.789396 4664 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.791501 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
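
Every retry in this burst fails at the same point: the status PATCH reaches the API server, which then cannot call the node.network-node-identity.openshift.io admission webhook on 127.0.0.1:9743 because the webhook's serving certificate expired on 2025-08-24, well before the node's clock time of 2025-10-03. A minimal on-node check, sketched assuming shell access to the CRC VM and that openssl is installed there, reads the validity window straight off the listening socket:

    # Print notBefore/notAfter for whatever certificate is served on the webhook port
    # (the 127.0.0.1:9743 endpoint is taken from the Post URL in the error above).
    echo | openssl s_client -connect 127.0.0.1:9743 2>/dev/null | openssl x509 -noout -dates

If notAfter matches the 2025-08-24T17:21:41Z deadline quoted in the log, node status updates cannot succeed until that certificate is rotated; the kubelet itself is behaving normally and simply stops after its fixed retry budget, which is what the "update node status exceeds retry count" line records.
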
event="NodeHasSufficientMemory" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.791566 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.791579 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.791598 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.791633 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:55Z","lastTransitionTime":"2025-10-03T07:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.876428 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.876538 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.876458 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.876695 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.876811 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:48:55 crc kubenswrapper[4664]: E1003 07:48:55.877090 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.895180 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.895236 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.895251 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.895272 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.895286 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:55Z","lastTransitionTime":"2025-10-03T07:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.998434 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.999061 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.999080 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.999108 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:55 crc kubenswrapper[4664]: I1003 07:48:55.999128 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:55Z","lastTransitionTime":"2025-10-03T07:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.101922 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.101986 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.101999 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.102020 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.102034 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:56Z","lastTransitionTime":"2025-10-03T07:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.116418 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/0.log" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.118922 4664 generic.go:334] "Generic (PLEG): container finished" podID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerID="b9a2122c70537d35a5c0829063c331dcaecb4f1835ebc5bf577c2441b5ccae3d" exitCode=1 Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.118965 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerDied","Data":"b9a2122c70537d35a5c0829063c331dcaecb4f1835ebc5bf577c2441b5ccae3d"} Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.119792 4664 scope.go:117] "RemoveContainer" containerID="b9a2122c70537d35a5c0829063c331dcaecb4f1835ebc5bf577c2441b5ccae3d" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.136540 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.136540 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:56Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.159183 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a2122c70537d35a5c0829063c331dcaecb4f1835ebc5bf577c2441b5ccae3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a2122c70537d35a5c0829063c331dcaecb4f1835ebc5bf577c2441b5ccae3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"946 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 07:48:55.868387 5946 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 07:48:55.868399 5946 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 07:48:55.868411 5946 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 07:48:55.867999 5946 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 07:48:55.868515 5946 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 07:48:55.868533 5946 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 07:48:55.868543 5946 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 07:48:55.868590 5946 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868627 5946 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 07:48:55.868655 5946 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868705 5946 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 07:48:55.868708 5946 factory.go:656] Stopping watch factory\\\\nI1003 07:48:55.868734 5946 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:56Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.177422 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:56Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.193179 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:56Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.205064 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.205106 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.205116 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.205136 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.205151 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:56Z","lastTransitionTime":"2025-10-03T07:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.207573 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:56Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.221984 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:56Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.239685 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:56Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.254248 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:56Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.269749 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:56Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.286940 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:56Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.305730 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:56Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.308391 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.308428 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.308440 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.308460 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.308475 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:56Z","lastTransitionTime":"2025-10-03T07:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.323960 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:56Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.340731 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:56Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.360122 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:56Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.411803 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.411854 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.411867 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.411889 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.411903 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:56Z","lastTransitionTime":"2025-10-03T07:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.515793 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.515848 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.515864 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.515896 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.515911 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:56Z","lastTransitionTime":"2025-10-03T07:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.618857 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.618901 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.618911 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.618925 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.618936 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:56Z","lastTransitionTime":"2025-10-03T07:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.722025 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.722091 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.722106 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.722131 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.722143 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:56Z","lastTransitionTime":"2025-10-03T07:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.825573 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.825661 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.825675 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.825697 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.825713 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:56Z","lastTransitionTime":"2025-10-03T07:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.929265 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.929328 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.929341 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.929363 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:56 crc kubenswrapper[4664]: I1003 07:48:56.929380 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:56Z","lastTransitionTime":"2025-10-03T07:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.033234 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.033324 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.033339 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.033371 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.033385 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:57Z","lastTransitionTime":"2025-10-03T07:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.125215 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/0.log"
Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.128736 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerStarted","Data":"4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4"}
Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.128887 4664 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.135368 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.135420 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.135432 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.135454 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.135467 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:57Z","lastTransitionTime":"2025-10-03T07:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.147139 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:57Z is after 2025-08-24T17:21:41Z"
Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.173749 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:57Z is after 2025-08-24T17:21:41Z"
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.203396 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.219373 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 
07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.238556 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.238715 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.238739 4664 kubelet_node_status.go:724] "Recording event 
Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.238769 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.238790 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:57Z","lastTransitionTime":"2025-10-03T07:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.240990 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a2122c70537d35a5c0829063c331dcaecb4f1835ebc5bf577c2441b5ccae3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"946 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 07:48:55.868387 5946 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 07:48:55.868399 5946 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 07:48:55.868411 5946 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 07:48:55.867999 5946 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 07:48:55.868515 5946 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 07:48:55.868533 5946 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 07:48:55.868543 5946 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 07:48:55.868590 5946 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868627 5946 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 07:48:55.868655 5946 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868705 5946 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 07:48:55.868708 5946 factory.go:656] Stopping watch factory\\\\nI1003 07:48:55.868734 5946 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:57Z is after 2025-08-24T17:21:41Z"
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.299530 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.341952 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.342033 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.342060 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.342116 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.342146 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:57Z","lastTransitionTime":"2025-10-03T07:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.358052 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.372715 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.388893 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.406321 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.421622 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.435161 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T07:48:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.446848 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.446922 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.446938 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.446962 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.446981 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:57Z","lastTransitionTime":"2025-10-03T07:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.549867 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.549919 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.549929 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.549945 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.549960 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:57Z","lastTransitionTime":"2025-10-03T07:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.653017 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.653088 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.653103 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.653134 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.653161 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:57Z","lastTransitionTime":"2025-10-03T07:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.756644 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.756711 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.756726 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.756746 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.756760 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:57Z","lastTransitionTime":"2025-10-03T07:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.859927 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.860012 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.860041 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.860072 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.860092 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:57Z","lastTransitionTime":"2025-10-03T07:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.876396 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:57 crc kubenswrapper[4664]: E1003 07:48:57.876689 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.876881 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.876978 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:57 crc kubenswrapper[4664]: E1003 07:48:57.877125 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:48:57 crc kubenswrapper[4664]: E1003 07:48:57.877244 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.964358 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.964403 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.964414 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.964434 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:57 crc kubenswrapper[4664]: I1003 07:48:57.964447 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:57Z","lastTransitionTime":"2025-10-03T07:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.067279 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.067329 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.067340 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.067359 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.067374 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:58Z","lastTransitionTime":"2025-10-03T07:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.122909 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq"] Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.123468 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.125814 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.126926 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.136844 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/1.log" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.138086 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/0.log" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.141679 4664 generic.go:334] "Generic (PLEG): container finished" podID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerID="4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4" exitCode=1 Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.141755 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerDied","Data":"4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4"} Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.141842 4664 scope.go:117] "RemoveContainer" containerID="b9a2122c70537d35a5c0829063c331dcaecb4f1835ebc5bf577c2441b5ccae3d" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.142863 4664 scope.go:117] "RemoveContainer" containerID="4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4" Oct 03 07:48:58 crc kubenswrapper[4664]: E1003 07:48:58.143095 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.145862 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1
ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.168321 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a2122c70537d35a5c0829063c331dcaecb4f1835ebc5bf577c2441b5ccae3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"946 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 07:48:55.868387 5946 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 07:48:55.868399 5946 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 07:48:55.868411 5946 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 07:48:55.867999 5946 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 07:48:55.868515 5946 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 07:48:55.868533 5946 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 07:48:55.868543 5946 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 07:48:55.868590 5946 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868627 5946 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 07:48:55.868655 5946 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868705 5946 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 07:48:55.868708 5946 factory.go:656] Stopping watch factory\\\\nI1003 07:48:55.868734 5946 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.170065 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.170107 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.170123 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.170143 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.170156 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:58Z","lastTransitionTime":"2025-10-03T07:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.187005 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.199918 4664 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.212396 4664 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.226679 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.241703 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.248352 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/988af70a-5398-4c96-b2a7-4b8e143303bc-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-27zqq\" (UID: \"988af70a-5398-4c96-b2a7-4b8e143303bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.248455 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr7ss\" (UniqueName: \"kubernetes.io/projected/988af70a-5398-4c96-b2a7-4b8e143303bc-kube-api-access-cr7ss\") pod \"ovnkube-control-plane-749d76644c-27zqq\" (UID: \"988af70a-5398-4c96-b2a7-4b8e143303bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.249040 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/988af70a-5398-4c96-b2a7-4b8e143303bc-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-27zqq\" (UID: \"988af70a-5398-4c96-b2a7-4b8e143303bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.249092 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/988af70a-5398-4c96-b2a7-4b8e143303bc-env-overrides\") pod \"ovnkube-control-plane-749d76644c-27zqq\" (UID: \"988af70a-5398-4c96-b2a7-4b8e143303bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.257211 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.271306 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.272787 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.272850 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.272862 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.272879 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.272889 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:58Z","lastTransitionTime":"2025-10-03T07:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.287382 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.305635 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.321119 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.335085 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.346073 4664 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.349934 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr7ss\" (UniqueName: \"kubernetes.io/projected/988af70a-5398-4c96-b2a7-4b8e143303bc-kube-api-access-cr7ss\") pod \"ovnkube-control-plane-749d76644c-27zqq\" (UID: \"988af70a-5398-4c96-b2a7-4b8e143303bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.350085 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/988af70a-5398-4c96-b2a7-4b8e143303bc-env-overrides\") pod \"ovnkube-control-plane-749d76644c-27zqq\" (UID: \"988af70a-5398-4c96-b2a7-4b8e143303bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.350111 
4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/988af70a-5398-4c96-b2a7-4b8e143303bc-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-27zqq\" (UID: \"988af70a-5398-4c96-b2a7-4b8e143303bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.350137 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/988af70a-5398-4c96-b2a7-4b8e143303bc-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-27zqq\" (UID: \"988af70a-5398-4c96-b2a7-4b8e143303bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.351689 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/988af70a-5398-4c96-b2a7-4b8e143303bc-env-overrides\") pod \"ovnkube-control-plane-749d76644c-27zqq\" (UID: \"988af70a-5398-4c96-b2a7-4b8e143303bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.352005 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/988af70a-5398-4c96-b2a7-4b8e143303bc-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-27zqq\" (UID: \"988af70a-5398-4c96-b2a7-4b8e143303bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.357446 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/988af70a-5398-4c96-b2a7-4b8e143303bc-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-27zqq\" (UID: \"988af70a-5398-4c96-b2a7-4b8e143303bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.363177 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.367342 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr7ss\" (UniqueName: \"kubernetes.io/projected/988af70a-5398-4c96-b2a7-4b8e143303bc-kube-api-access-cr7ss\") pod \"ovnkube-control-plane-749d76644c-27zqq\" (UID: \"988af70a-5398-4c96-b2a7-4b8e143303bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.376325 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.376976 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.376989 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.377011 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.377024 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:58Z","lastTransitionTime":"2025-10-03T07:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.380210 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.399990 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a2122c70537d35a5c0829063c331dcaecb4f1835ebc5bf577c2441b5ccae3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"946 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 07:48:55.868387 5946 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 07:48:55.868399 5946 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 07:48:55.868411 5946 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 07:48:55.867999 5946 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 07:48:55.868515 5946 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 07:48:55.868533 5946 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 07:48:55.868543 5946 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 07:48:55.868590 5946 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868627 5946 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 07:48:55.868655 5946 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868705 5946 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 07:48:55.868708 5946 factory.go:656] Stopping watch factory\\\\nI1003 07:48:55.868734 5946 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:57Z\\\",\\\"message\\\":\\\"ode-ca-9z9q9\\\\nI1003 07:48:57.083722 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083725 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 07:48:57.083739 6091 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083744 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1003 07:48:57.083746 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1003 07:48:57.083749 6091 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.415626 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.430534 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.438717 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.441691 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.455917 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.473550 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.480103 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.480140 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.480151 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.480166 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.480190 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:58Z","lastTransitionTime":"2025-10-03T07:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.485753 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.499149 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.514726 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.530682 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.546281 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.564282 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.577163 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.582781 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.582831 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.582848 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.582873 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.582890 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:58Z","lastTransitionTime":"2025-10-03T07:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.593427 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:58Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.685986 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.686069 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.686096 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.686137 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.686165 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:58Z","lastTransitionTime":"2025-10-03T07:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.789831 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.789907 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.789927 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.789959 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.789979 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:58Z","lastTransitionTime":"2025-10-03T07:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.893153 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.893199 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.893209 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.893225 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.893234 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:58Z","lastTransitionTime":"2025-10-03T07:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.995965 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.996021 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.996033 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.996054 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:58 crc kubenswrapper[4664]: I1003 07:48:58.996066 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:58Z","lastTransitionTime":"2025-10-03T07:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.099492 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.099556 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.099574 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.099592 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.099639 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:59Z","lastTransitionTime":"2025-10-03T07:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.150815 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/1.log" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.156071 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" event={"ID":"988af70a-5398-4c96-b2a7-4b8e143303bc","Type":"ContainerStarted","Data":"a7471662111bde87f7e570b0e24713cd97e01ff556ab714cf0869a809b88e23b"} Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.203132 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.203194 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.203208 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.203233 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.203251 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:59Z","lastTransitionTime":"2025-10-03T07:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.306978 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.307052 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.307065 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.307089 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.307103 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:59Z","lastTransitionTime":"2025-10-03T07:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.410068 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.410151 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.410171 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.410201 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.410263 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:59Z","lastTransitionTime":"2025-10-03T07:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.513553 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.513653 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.513672 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.513702 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.513720 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:59Z","lastTransitionTime":"2025-10-03T07:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.597174 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-l687s"] Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.597623 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:48:59 crc kubenswrapper[4664]: E1003 07:48:59.597681 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.609509 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.616989 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.617025 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.617034 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.617049 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.617058 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:59Z","lastTransitionTime":"2025-10-03T07:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.620642 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.630999 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.648050 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.657498 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc 
kubenswrapper[4664]: I1003 07:48:59.664501 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h72r\" (UniqueName: \"kubernetes.io/projected/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-kube-api-access-6h72r\") pod \"network-metrics-daemon-l687s\" (UID: \"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\") " pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.664539 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs\") pod \"network-metrics-daemon-l687s\" (UID: \"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\") " pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.671982 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.690390 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a2122c70537d35a5c0829063c331dcaecb4f1835ebc5bf577c2441b5ccae3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"946 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 07:48:55.868387 5946 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 07:48:55.868399 5946 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 07:48:55.868411 5946 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 07:48:55.867999 5946 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 07:48:55.868515 5946 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 07:48:55.868533 5946 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 07:48:55.868543 5946 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 07:48:55.868590 5946 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868627 5946 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 07:48:55.868655 5946 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868705 5946 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 07:48:55.868708 5946 factory.go:656] Stopping watch factory\\\\nI1003 07:48:55.868734 5946 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:57Z\\\",\\\"message\\\":\\\"ode-ca-9z9q9\\\\nI1003 07:48:57.083722 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083725 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 07:48:57.083739 6091 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083744 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1003 07:48:57.083746 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1003 07:48:57.083749 6091 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.704478 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.715272 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.719113 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:59 crc 
kubenswrapper[4664]: I1003 07:48:59.719151 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.719162 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.719178 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.719189 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:59Z","lastTransitionTime":"2025-10-03T07:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.728667 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/
env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.741411 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.757865 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.765573 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs\") pod \"network-metrics-daemon-l687s\" (UID: \"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\") " pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.765716 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h72r\" (UniqueName: \"kubernetes.io/projected/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-kube-api-access-6h72r\") pod \"network-metrics-daemon-l687s\" (UID: \"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\") " pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:48:59 crc kubenswrapper[4664]: E1003 07:48:59.765773 4664 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:48:59 crc kubenswrapper[4664]: E1003 07:48:59.765856 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs podName:7f2800e0-b66e-4ab2-ad4f-37c5ffe60120 nodeName:}" failed. No retries permitted until 2025-10-03 07:49:00.26583363 +0000 UTC m=+41.087024120 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs") pod "network-metrics-daemon-l687s" (UID: "7f2800e0-b66e-4ab2-ad4f-37c5ffe60120") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.770249 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.784998 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.786057 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h72r\" (UniqueName: \"kubernetes.io/projected/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-kube-api-access-6h72r\") pod \"network-metrics-daemon-l687s\" (UID: \"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\") " pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.802089 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.814209 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.822082 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:59 
crc kubenswrapper[4664]: I1003 07:48:59.822127 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.822137 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.822152 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.822162 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:59Z","lastTransitionTime":"2025-10-03T07:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.875295 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.875384 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:48:59 crc kubenswrapper[4664]: E1003 07:48:59.875432 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.875302 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:48:59 crc kubenswrapper[4664]: E1003 07:48:59.875753 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:48:59 crc kubenswrapper[4664]: E1003 07:48:59.875839 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.887889 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.901709 4664 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.922290 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a2122c70537d35a5c0829063c331dcaecb4f1835ebc5bf577c2441b5ccae3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"946 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 07:48:55.868387 5946 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 07:48:55.868399 5946 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 07:48:55.868411 5946 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 07:48:55.867999 5946 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 07:48:55.868515 5946 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 07:48:55.868533 5946 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 07:48:55.868543 5946 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 07:48:55.868590 5946 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868627 5946 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 07:48:55.868655 5946 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868705 5946 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 07:48:55.868708 5946 factory.go:656] Stopping watch factory\\\\nI1003 07:48:55.868734 5946 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:57Z\\\",\\\"message\\\":\\\"ode-ca-9z9q9\\\\nI1003 07:48:57.083722 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083725 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 07:48:57.083739 6091 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083744 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1003 07:48:57.083746 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1003 07:48:57.083749 6091 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.925340 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.925395 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.925406 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.925423 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.925816 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:48:59Z","lastTransitionTime":"2025-10-03T07:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.938681 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.951845 4664 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.965310 4664 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.980300 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:48:59 crc kubenswrapper[4664]: I1003 07:48:59.994038 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:48:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.005463 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z"
Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.029013 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.029081 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.029104 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.029132 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.029151 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:00Z","lastTransitionTime":"2025-10-03T07:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.044109 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.077031 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.096584 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.110850 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.125424 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.131263 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.131319 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.131329 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.131348 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.131361 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:00Z","lastTransitionTime":"2025-10-03T07:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.141076 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.154798 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.161548 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" event={"ID":"988af70a-5398-4c96-b2a7-4b8e143303bc","Type":"ContainerStarted","Data":"bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac"} Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.161617 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" event={"ID":"988af70a-5398-4c96-b2a7-4b8e143303bc","Type":"ContainerStarted","Data":"7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183"} Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.175702 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"}
,{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.191963 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98d
fd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.204103 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.216123 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.235079 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.240045 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.240081 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.240091 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.240111 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.240123 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:00Z","lastTransitionTime":"2025-10-03T07:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.257634 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a2122c70537d35a5c0829063c331dcaecb4f1835ebc5bf577c2441b5ccae3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"946 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 07:48:55.868387 5946 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 07:48:55.868399 5946 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 07:48:55.868411 5946 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 07:48:55.867999 5946 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 07:48:55.868515 5946 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 07:48:55.868533 5946 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 07:48:55.868543 5946 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 07:48:55.868590 5946 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868627 5946 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 07:48:55.868655 5946 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868705 5946 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 07:48:55.868708 5946 factory.go:656] Stopping watch factory\\\\nI1003 07:48:55.868734 5946 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:57Z\\\",\\\"message\\\":\\\"ode-ca-9z9q9\\\\nI1003 07:48:57.083722 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083725 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 07:48:57.083739 6091 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083744 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1003 07:48:57.083746 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1003 07:48:57.083749 6091 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.271296 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs\") pod \"network-metrics-daemon-l687s\" (UID: \"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\") " pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:00 crc kubenswrapper[4664]: E1003 07:49:00.271495 4664 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:49:00 crc kubenswrapper[4664]: E1003 07:49:00.271588 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs podName:7f2800e0-b66e-4ab2-ad4f-37c5ffe60120 nodeName:}" failed. No retries permitted until 2025-10-03 07:49:01.271561971 +0000 UTC m=+42.092752461 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs") pod "network-metrics-daemon-l687s" (UID: "7f2800e0-b66e-4ab2-ad4f-37c5ffe60120") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.273035 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.295469 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.317256 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.330586 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.342508 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.342550 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.342564 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.342588 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.342636 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:00Z","lastTransitionTime":"2025-10-03T07:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.347019 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.362229 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.375203 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.387087 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974
810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.401476 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.418385 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.446625 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.446679 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.446692 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.446714 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.446729 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:00Z","lastTransitionTime":"2025-10-03T07:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.560200 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.560265 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.560278 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.560298 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.560310 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:00Z","lastTransitionTime":"2025-10-03T07:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.663447 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.663543 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.663565 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.663590 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.663627 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:00Z","lastTransitionTime":"2025-10-03T07:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.766134 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.766174 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.766185 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.766202 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.766212 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:00Z","lastTransitionTime":"2025-10-03T07:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.869432 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.869473 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.869484 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.869503 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.869518 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:00Z","lastTransitionTime":"2025-10-03T07:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.875211 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:00 crc kubenswrapper[4664]: E1003 07:49:00.875418 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.972381 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.972443 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.972456 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.972473 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:00 crc kubenswrapper[4664]: I1003 07:49:00.972485 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:00Z","lastTransitionTime":"2025-10-03T07:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.074572 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.074652 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.074666 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.074691 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.074712 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:01Z","lastTransitionTime":"2025-10-03T07:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.176227 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.176269 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.176278 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.176292 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.176304 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:01Z","lastTransitionTime":"2025-10-03T07:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.279733 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.279803 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.279822 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.279850 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.279864 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:01Z","lastTransitionTime":"2025-10-03T07:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.281056 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs\") pod \"network-metrics-daemon-l687s\" (UID: \"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\") " pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:01 crc kubenswrapper[4664]: E1003 07:49:01.281629 4664 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:49:01 crc kubenswrapper[4664]: E1003 07:49:01.281714 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs podName:7f2800e0-b66e-4ab2-ad4f-37c5ffe60120 nodeName:}" failed. No retries permitted until 2025-10-03 07:49:03.281691714 +0000 UTC m=+44.102882214 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs") pod "network-metrics-daemon-l687s" (UID: "7f2800e0-b66e-4ab2-ad4f-37c5ffe60120") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.382840 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.382897 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.382911 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.382931 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.382945 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:01Z","lastTransitionTime":"2025-10-03T07:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.485812 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.485853 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.485862 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.485882 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.485892 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:01Z","lastTransitionTime":"2025-10-03T07:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.587961 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.588013 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.588026 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.588044 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.588055 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:01Z","lastTransitionTime":"2025-10-03T07:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.690118 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.690159 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.690169 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.690184 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.690194 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:01Z","lastTransitionTime":"2025-10-03T07:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.792266 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.792296 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.792305 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.792317 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.792326 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:01Z","lastTransitionTime":"2025-10-03T07:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.875868 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.875948 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:01 crc kubenswrapper[4664]: E1003 07:49:01.876031 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.876267 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:01 crc kubenswrapper[4664]: E1003 07:49:01.876401 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.876587 4664 scope.go:117] "RemoveContainer" containerID="387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9" Oct 03 07:49:01 crc kubenswrapper[4664]: E1003 07:49:01.876595 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.893854 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.893881 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.893890 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.893905 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.893913 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:01Z","lastTransitionTime":"2025-10-03T07:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.996470 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.996503 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.996514 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.996530 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:01 crc kubenswrapper[4664]: I1003 07:49:01.996541 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:01Z","lastTransitionTime":"2025-10-03T07:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.099291 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.099335 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.099345 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.099359 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.099370 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:02Z","lastTransitionTime":"2025-10-03T07:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.172626 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.175520 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81"} Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.202498 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.202541 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.202550 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.202565 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.202578 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:02Z","lastTransitionTime":"2025-10-03T07:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.305473 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.305510 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.305522 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.305539 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.305549 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:02Z","lastTransitionTime":"2025-10-03T07:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.408968 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.409012 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.409028 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.409052 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.409068 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:02Z","lastTransitionTime":"2025-10-03T07:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.511214 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.511257 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.511267 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.511281 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.511295 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:02Z","lastTransitionTime":"2025-10-03T07:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.614382 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.614441 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.614452 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.614469 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.614481 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:02Z","lastTransitionTime":"2025-10-03T07:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.718681 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.718765 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.718783 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.718803 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.718819 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:02Z","lastTransitionTime":"2025-10-03T07:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.820940 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.820993 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.821020 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.821044 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.821059 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:02Z","lastTransitionTime":"2025-10-03T07:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.875953 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:02 crc kubenswrapper[4664]: E1003 07:49:02.876129 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.924453 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.924510 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.924525 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.924547 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:02 crc kubenswrapper[4664]: I1003 07:49:02.924564 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:02Z","lastTransitionTime":"2025-10-03T07:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.027492 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.027549 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.027564 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.027586 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.027599 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:03Z","lastTransitionTime":"2025-10-03T07:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.130404 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.130447 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.130457 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.130473 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.130486 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:03Z","lastTransitionTime":"2025-10-03T07:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.179227 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.194895 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.208591 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.220837 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.234255 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.234305 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.234318 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.234338 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.234350 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:03Z","lastTransitionTime":"2025-10-03T07:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.237125 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.253438 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.275353 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a2122c70537d35a5c0829063c331dcaecb4f1835ebc5bf577c2441b5ccae3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:55Z\\\",\\\"message\\\":\\\"946 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 07:48:55.868387 5946 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 07:48:55.868399 5946 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 07:48:55.868411 5946 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 07:48:55.867999 5946 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 07:48:55.868515 5946 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 07:48:55.868533 5946 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 07:48:55.868543 5946 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 07:48:55.868590 5946 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868627 5946 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 07:48:55.868655 5946 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:48:55.868705 5946 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 07:48:55.868708 5946 factory.go:656] Stopping watch factory\\\\nI1003 07:48:55.868734 5946 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:57Z\\\",\\\"message\\\":\\\"ode-ca-9z9q9\\\\nI1003 07:48:57.083722 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083725 6091 obj_retry.go:303] 
Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 07:48:57.083739 6091 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083744 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1003 07:48:57.083746 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1003 07:48:57.083749 6091 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.291734 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.304686 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 
07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.304841 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs\") pod \"network-metrics-daemon-l687s\" (UID: \"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\") " pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:03 crc kubenswrapper[4664]: E1003 07:49:03.304953 4664 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:49:03 crc kubenswrapper[4664]: E1003 07:49:03.305027 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs podName:7f2800e0-b66e-4ab2-ad4f-37c5ffe60120 nodeName:}" failed. No retries permitted until 2025-10-03 07:49:07.305009407 +0000 UTC m=+48.126199897 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs") pod "network-metrics-daemon-l687s" (UID: "7f2800e0-b66e-4ab2-ad4f-37c5ffe60120") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.317938 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.331300 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.336672 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.336736 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.336747 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.336773 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.336787 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:03Z","lastTransitionTime":"2025-10-03T07:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.343327 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.355145 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.368279 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.380706 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.394812 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.407344 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:03Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.438960 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.439011 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.439022 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.439038 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.439051 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:03Z","lastTransitionTime":"2025-10-03T07:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.541973 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.542018 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.542029 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.542046 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.542058 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:03Z","lastTransitionTime":"2025-10-03T07:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.643788 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.643832 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.643843 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.643858 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.643868 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:03Z","lastTransitionTime":"2025-10-03T07:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.745956 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.745991 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.746002 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.746019 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.746027 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:03Z","lastTransitionTime":"2025-10-03T07:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.848682 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.848725 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.848747 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.848762 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.848852 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:03Z","lastTransitionTime":"2025-10-03T07:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.876126 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.876166 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.876270 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:03 crc kubenswrapper[4664]: E1003 07:49:03.876371 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:03 crc kubenswrapper[4664]: E1003 07:49:03.876482 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:03 crc kubenswrapper[4664]: E1003 07:49:03.876581 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.951313 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.951361 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.951371 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.951388 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:03 crc kubenswrapper[4664]: I1003 07:49:03.951410 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:03Z","lastTransitionTime":"2025-10-03T07:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.054098 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.054138 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.054148 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.054165 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.054175 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:04Z","lastTransitionTime":"2025-10-03T07:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.156408 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.156451 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.156464 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.156478 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.156490 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:04Z","lastTransitionTime":"2025-10-03T07:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.259009 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.259049 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.259060 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.259075 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.259086 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:04Z","lastTransitionTime":"2025-10-03T07:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.361096 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.361126 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.361143 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.361160 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.361170 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:04Z","lastTransitionTime":"2025-10-03T07:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.463301 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.463336 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.463346 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.463360 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.463373 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:04Z","lastTransitionTime":"2025-10-03T07:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.566445 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.566502 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.566513 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.566527 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.566540 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:04Z","lastTransitionTime":"2025-10-03T07:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.669737 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.669777 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.669790 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.669806 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.669816 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:04Z","lastTransitionTime":"2025-10-03T07:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.772084 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.772149 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.772160 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.772182 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.772195 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:04Z","lastTransitionTime":"2025-10-03T07:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.875211 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.875268 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:04 crc kubenswrapper[4664]: E1003 07:49:04.875358 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.875270 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.875505 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.875543 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.875557 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:04Z","lastTransitionTime":"2025-10-03T07:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.978042 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.978112 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.978126 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.978151 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:04 crc kubenswrapper[4664]: I1003 07:49:04.978169 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:04Z","lastTransitionTime":"2025-10-03T07:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.081182 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.081237 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.081247 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.081267 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.081278 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:05Z","lastTransitionTime":"2025-10-03T07:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.184192 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.184248 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.184259 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.184282 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.184295 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:05Z","lastTransitionTime":"2025-10-03T07:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.288070 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.288130 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.288140 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.288157 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.288168 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:05Z","lastTransitionTime":"2025-10-03T07:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.391635 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.391678 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.391687 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.391711 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.391721 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:05Z","lastTransitionTime":"2025-10-03T07:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.495812 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.495867 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.495878 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.495896 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.495907 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:05Z","lastTransitionTime":"2025-10-03T07:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.599356 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.599415 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.599425 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.599445 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.599455 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:05Z","lastTransitionTime":"2025-10-03T07:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.702125 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.702171 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.702180 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.702198 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.702209 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:05Z","lastTransitionTime":"2025-10-03T07:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.804687 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.804740 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.804751 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.804765 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.804774 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:05Z","lastTransitionTime":"2025-10-03T07:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
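The kubelet loops on the same Ready=False condition (recorded every ~100 ms from 07:49:03 through 07:49:05) because the container runtime reports no CNI configuration in /etc/kubernetes/cni/net.d/. A quick way to confirm that state on the node is to list config files in that directory; a minimal Python sketch, where the path comes from the log message above and the extension set is an assumption about what CNI config loaders typically accept:

    #!/usr/bin/env python3
    """Check whether the CNI config dir the kubelet complains about is empty."""
    from pathlib import Path

    # Path taken from the "no CNI configuration file" message in the log above.
    CNI_DIR = Path("/etc/kubernetes/cni/net.d")

    def cni_configs(d: Path) -> list[Path]:
        # Assumption: CNI loaders consider .conf, .conflist and .json files.
        return sorted(p for p in d.iterdir() if p.suffix in {".conf", ".conflist", ".json"})

    if __name__ == "__main__":
        if not CNI_DIR.is_dir():
            print(f"{CNI_DIR} does not exist - the network provider has not written a config")
        else:
            found = cni_configs(CNI_DIR)
            if found:
                print("CNI configs present:", ", ".join(p.name for p in found))
            else:
                print(f"{CNI_DIR} is empty - matches the NetworkPluginNotReady condition")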
Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.833513 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.833557 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.833569 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.833590 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.833641 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:05Z","lastTransitionTime":"2025-10-03T07:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 07:49:05 crc kubenswrapper[4664]: E1003 07:49:05.846932 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:05Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.851041 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.851105 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.851132 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.851147 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.851157 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:05Z","lastTransitionTime":"2025-10-03T07:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:05 crc kubenswrapper[4664]: E1003 07:49:05.863323 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:05Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.867000 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.867073 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
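Every status patch above is rejected for the same reason: the node-identity webhook at https://127.0.0.1:9743 presents a certificate that expired on 2025-08-24T17:21:41Z. A sketch that fetches that endpoint's certificate and reports its validity window; host and port come from the webhook URL in the error, and the third-party cryptography package (>= 42 for the *_utc accessors) is an assumed dependency:

    #!/usr/bin/env python3
    """Probe the webhook endpoint from the log and report certificate validity."""
    import ssl
    from datetime import datetime, timezone

    from cryptography import x509  # assumption: pip install cryptography

    # Endpoint taken from the failed webhook call in the log above.
    HOST, PORT = "127.0.0.1", 9743

    # Fetch the peer certificate without verifying it (we expect it to be bad).
    pem = ssl.get_server_certificate((HOST, PORT))
    cert = x509.load_pem_x509_certificate(pem.encode())

    now = datetime.now(timezone.utc)
    print("subject: ", cert.subject.rfc4514_string())
    print("notAfter:", cert.not_valid_after_utc.isoformat())
    print("expired: ", now > cert.not_valid_after_utc)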
event="NodeHasNoDiskPressure" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.867085 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.867108 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.867122 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:05Z","lastTransitionTime":"2025-10-03T07:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.875427 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.875427 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:05 crc kubenswrapper[4664]: E1003 07:49:05.875548 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:05 crc kubenswrapper[4664]: E1003 07:49:05.875595 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.875448 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:05 crc kubenswrapper[4664]: E1003 07:49:05.875721 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:05 crc kubenswrapper[4664]: E1003 07:49:05.878900 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:05Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.882478 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.882506 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
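The rejected patch itself is readable. The journal stores it as a Go-quoted string, so at this nesting depth the payload's quotes appear as \\\" while the outer delimiters are \" (an assumption based on the escaping visible in the entries above). A rough sketch that recovers and summarizes the first such patch from a journal dump:

    #!/usr/bin/env python3
    """Recover the JSON status patch from an "Error updating node status" entry."""
    import json
    import re
    import sys

    # Capture the quoted payload between 'failed to patch status \"' and '\" for node'.
    PATCH = re.compile(r'failed to patch status \\"(\{.*\})\\" for node')

    for line in sys.stdin:
        m = PATCH.search(line)
        if not m:
            continue
        # Undo the inner Go quoting: \\\" -> " (assumption, matches the log above).
        payload = m.group(1).replace(r'\\\"', '"')
        status = json.loads(payload)["status"]
        for cond in status.get("conditions", []):
            print(f'{cond["type"]:>16}: {cond["status"]}  ({cond.get("reason", "")})')
        # The image inventory is what makes each retried patch so large.
        print(len(status.get("images", [])), "images listed in patch")
        break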
event="NodeHasNoDiskPressure" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.882518 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.882533 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.882545 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:05Z","lastTransitionTime":"2025-10-03T07:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:05 crc kubenswrapper[4664]: E1003 07:49:05.894281 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:05Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.897611 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.897665 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.897674 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.897687 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.897697 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:05Z","lastTransitionTime":"2025-10-03T07:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:05 crc kubenswrapper[4664]: E1003 07:49:05.909002 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:05Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:05 crc kubenswrapper[4664]: E1003 07:49:05.909185 4664 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.910571 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.910606 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.910617 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.910647 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:05 crc kubenswrapper[4664]: I1003 07:49:05.910659 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:05Z","lastTransitionTime":"2025-10-03T07:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.013963 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.014028 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.014042 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.014065 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.014084 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:06Z","lastTransitionTime":"2025-10-03T07:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.116642 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.117230 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.117247 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.117273 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.117287 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:06Z","lastTransitionTime":"2025-10-03T07:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.220044 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.220087 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.220098 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.220113 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.220123 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:06Z","lastTransitionTime":"2025-10-03T07:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.322995 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.323041 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.323051 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.323065 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.323080 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:06Z","lastTransitionTime":"2025-10-03T07:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.425706 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.425758 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.425769 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.425788 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.425800 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:06Z","lastTransitionTime":"2025-10-03T07:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.528654 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.528708 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.528722 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.528747 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.528767 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:06Z","lastTransitionTime":"2025-10-03T07:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.631281 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.631326 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.631336 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.631351 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.631361 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:06Z","lastTransitionTime":"2025-10-03T07:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.733654 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.733705 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.733715 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.733729 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.733738 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:06Z","lastTransitionTime":"2025-10-03T07:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.836333 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.836383 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.836395 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.836411 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.836423 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:06Z","lastTransitionTime":"2025-10-03T07:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.875559 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:06 crc kubenswrapper[4664]: E1003 07:49:06.875784 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.939300 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.939340 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.939353 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.939369 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:06 crc kubenswrapper[4664]: I1003 07:49:06.939381 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:06Z","lastTransitionTime":"2025-10-03T07:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.041421 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.041465 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.041476 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.041491 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.041503 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:07Z","lastTransitionTime":"2025-10-03T07:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.143667 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.143705 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.143714 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.143732 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.143744 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:07Z","lastTransitionTime":"2025-10-03T07:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.246031 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.246079 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.246091 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.246109 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.246141 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:07Z","lastTransitionTime":"2025-10-03T07:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.348061 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.348098 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.348106 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.348119 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.348128 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:07Z","lastTransitionTime":"2025-10-03T07:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.348495 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs\") pod \"network-metrics-daemon-l687s\" (UID: \"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\") " pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:07 crc kubenswrapper[4664]: E1003 07:49:07.348680 4664 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:49:07 crc kubenswrapper[4664]: E1003 07:49:07.348738 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs podName:7f2800e0-b66e-4ab2-ad4f-37c5ffe60120 nodeName:}" failed. No retries permitted until 2025-10-03 07:49:15.34872142 +0000 UTC m=+56.169911910 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs") pod "network-metrics-daemon-l687s" (UID: "7f2800e0-b66e-4ab2-ad4f-37c5ffe60120") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.450307 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.450355 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.450367 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.450385 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.450395 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:07Z","lastTransitionTime":"2025-10-03T07:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.552059 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.552105 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.552115 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.552131 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.552144 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:07Z","lastTransitionTime":"2025-10-03T07:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.654036 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.654069 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.654078 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.654090 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.654098 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:07Z","lastTransitionTime":"2025-10-03T07:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.755758 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.755800 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.755813 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.755830 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.755842 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:07Z","lastTransitionTime":"2025-10-03T07:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.786469 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.787270 4664 scope.go:117] "RemoveContainer" containerID="4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4" Oct 03 07:49:07 crc kubenswrapper[4664]: E1003 07:49:07.787443 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.801242 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:07Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.815452 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:07Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.827711 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:07Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.841861 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:07Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.858652 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.858701 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.858713 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.858730 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.858743 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:07Z","lastTransitionTime":"2025-10-03T07:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.865049 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:57Z\\\",\\\"message\\\":\\\"ode-ca-9z9q9\\\\nI1003 07:48:57.083722 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083725 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 07:48:57.083739 6091 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083744 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1003 07:48:57.083746 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1003 07:48:57.083749 6091 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:07Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.875971 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.876012 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.876096 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:07 crc kubenswrapper[4664]: E1003 07:49:07.876241 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:07 crc kubenswrapper[4664]: E1003 07:49:07.876340 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:07 crc kubenswrapper[4664]: E1003 07:49:07.876519 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.884104 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerI
D\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:07Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 
07:49:07.897196 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:07Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.908639 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:07Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.927801 4664 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:07Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.940990 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:07Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.952708 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:07Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.961268 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.961307 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.961315 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.961328 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.961338 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:07Z","lastTransitionTime":"2025-10-03T07:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.965864 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:07Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.979546 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:07Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:07 crc kubenswrapper[4664]: I1003 07:49:07.993746 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:07Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.003079 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.013957 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.063915 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.063971 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.063983 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.064001 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.064013 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:08Z","lastTransitionTime":"2025-10-03T07:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.166437 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.166486 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.166503 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.166520 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.166534 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:08Z","lastTransitionTime":"2025-10-03T07:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.268694 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.268732 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.268745 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.268761 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.268774 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:08Z","lastTransitionTime":"2025-10-03T07:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.371144 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.371191 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.371201 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.371218 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.371231 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:08Z","lastTransitionTime":"2025-10-03T07:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.474417 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.474472 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.474485 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.474508 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.474522 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:08Z","lastTransitionTime":"2025-10-03T07:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.577561 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.577651 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.577669 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.577692 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.577708 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:08Z","lastTransitionTime":"2025-10-03T07:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.591778 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.601501 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.608402 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.621541 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.633824 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.650008 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.665774 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 
07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.680227 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.680262 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.680272 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.680284 4664 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.680295 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:08Z","lastTransitionTime":"2025-10-03T07:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.686499 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910f
defad7e84fabca337fb456c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:57Z\\\",\\\"message\\\":\\\"ode-ca-9z9q9\\\\nI1003 07:48:57.083722 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083725 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 07:48:57.083739 6091 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083744 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1003 07:48:57.083746 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1003 07:48:57.083749 6091 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.703226 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.715351 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.729807 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.744828 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.756920 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.768070 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.780212 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.781829 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.781870 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.781883 4664 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.781899 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.781911 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:08Z","lastTransitionTime":"2025-10-03T07:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.793420 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.806084 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.815869 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:08Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.875679 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:08 crc kubenswrapper[4664]: E1003 07:49:08.875829 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.883854 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.883885 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.883898 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.883913 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.883924 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:08Z","lastTransitionTime":"2025-10-03T07:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.986682 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.986727 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.986735 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.986750 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:08 crc kubenswrapper[4664]: I1003 07:49:08.986759 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:08Z","lastTransitionTime":"2025-10-03T07:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.089406 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.089480 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.089491 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.089506 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.089518 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:09Z","lastTransitionTime":"2025-10-03T07:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.191232 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.191261 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.191269 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.191282 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.191290 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:09Z","lastTransitionTime":"2025-10-03T07:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.293561 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.293648 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.293657 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.293671 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.293680 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:09Z","lastTransitionTime":"2025-10-03T07:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.396758 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.396825 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.396845 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.396869 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.396887 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:09Z","lastTransitionTime":"2025-10-03T07:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.499856 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.499897 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.499908 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.499924 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.499938 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:09Z","lastTransitionTime":"2025-10-03T07:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.602082 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.602136 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.602146 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.602162 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.602173 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:09Z","lastTransitionTime":"2025-10-03T07:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.704134 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.704221 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.704238 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.704258 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.704276 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:09Z","lastTransitionTime":"2025-10-03T07:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.807105 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.807144 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.807159 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.807176 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.807189 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:09Z","lastTransitionTime":"2025-10-03T07:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.875304 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.875407 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:09 crc kubenswrapper[4664]: E1003 07:49:09.875505 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.875516 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:09 crc kubenswrapper[4664]: E1003 07:49:09.875645 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:09 crc kubenswrapper[4664]: E1003 07:49:09.875730 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.894904 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\
\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:09Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.908457 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:09Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.909394 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.909439 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.909448 4664 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.909463 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.909475 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:09Z","lastTransitionTime":"2025-10-03T07:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.921050 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:09Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.932272 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T07:49:09Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.945444 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:09Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.957201 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b74b4b6-2022-4308-8071-0f972dd1d922\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:09Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.975155 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:09Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.987458 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:09Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:09 crc kubenswrapper[4664]: I1003 07:49:09.999208 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:09Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.011371 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.011410 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.011420 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.011440 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.011450 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:10Z","lastTransitionTime":"2025-10-03T07:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.011651 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:10Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.022754 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:10Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.035472 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:10Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.045022 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:10Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:10 crc 
kubenswrapper[4664]: I1003 07:49:10.057843 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:10Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.094063 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:57Z\\\",\\\"message\\\":\\\"ode-ca-9z9q9\\\\nI1003 07:48:57.083722 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083725 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 07:48:57.083739 6091 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083744 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1003 07:48:57.083746 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1003 07:48:57.083749 6091 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:10Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.113227 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.113262 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.113271 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.113284 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.113293 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:10Z","lastTransitionTime":"2025-10-03T07:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.115248 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:10Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.126890 4664 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T07:49:10Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.215398 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.215443 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.215453 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.215469 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.215480 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:10Z","lastTransitionTime":"2025-10-03T07:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.317690 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.317731 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.317739 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.317752 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.317761 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:10Z","lastTransitionTime":"2025-10-03T07:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.420052 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.420097 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.420109 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.420126 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.420138 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:10Z","lastTransitionTime":"2025-10-03T07:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.522974 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.523017 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.523026 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.523039 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.523050 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:10Z","lastTransitionTime":"2025-10-03T07:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.625474 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.625510 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.625519 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.625533 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.625543 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:10Z","lastTransitionTime":"2025-10-03T07:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.727938 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.727979 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.727988 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.728005 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.728014 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:10Z","lastTransitionTime":"2025-10-03T07:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.830520 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.830572 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.830584 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.830603 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.830664 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:10Z","lastTransitionTime":"2025-10-03T07:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.875453 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:10 crc kubenswrapper[4664]: E1003 07:49:10.875568 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.933144 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.933228 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.933257 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.933288 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:10 crc kubenswrapper[4664]: I1003 07:49:10.933313 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:10Z","lastTransitionTime":"2025-10-03T07:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.039422 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.039677 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.039691 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.039712 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.039726 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:11Z","lastTransitionTime":"2025-10-03T07:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.141770 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.141812 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.141823 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.141838 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.141847 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:11Z","lastTransitionTime":"2025-10-03T07:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.244095 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.244135 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.244146 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.244160 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.244172 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:11Z","lastTransitionTime":"2025-10-03T07:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.346754 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.346796 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.346810 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.346827 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.346841 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:11Z","lastTransitionTime":"2025-10-03T07:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.449010 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.449059 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.449070 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.449085 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.449095 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:11Z","lastTransitionTime":"2025-10-03T07:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.551542 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.551589 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.551655 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.551675 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.551688 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:11Z","lastTransitionTime":"2025-10-03T07:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.653590 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.653673 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.653685 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.653701 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.653713 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:11Z","lastTransitionTime":"2025-10-03T07:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.692831 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 07:49:11.692974 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:49:43.692950945 +0000 UTC m=+84.514141435 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.693073 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.693104 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.693137 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.693154 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 07:49:11.693233 4664 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 07:49:11.693260 4664 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 
07:49:11.693279 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 07:49:11.693265 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:49:43.693258434 +0000 UTC m=+84.514448924 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 07:49:11.693305 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 07:49:11.693311 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 07:49:11.693334 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 07:49:11.693343 4664 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 07:49:11.693315 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:49:43.693302225 +0000 UTC m=+84.514492795 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 07:49:11.693320 4664 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 07:49:11.693501 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 07:49:43.693421468 +0000 UTC m=+84.514611968 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 07:49:11.693765 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 07:49:43.693695936 +0000 UTC m=+84.514886466 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.757083 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.757140 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.757156 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.757177 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.757190 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:11Z","lastTransitionTime":"2025-10-03T07:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.859298 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.859340 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.859349 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.859364 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.859375 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:11Z","lastTransitionTime":"2025-10-03T07:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.876079 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.876128 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 07:49:11.876213 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.876298 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 07:49:11.876367 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:11 crc kubenswrapper[4664]: E1003 07:49:11.876538 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.962763 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.962802 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.962811 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.962826 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:11 crc kubenswrapper[4664]: I1003 07:49:11.962837 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:11Z","lastTransitionTime":"2025-10-03T07:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.065005 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.065053 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.065065 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.065082 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.065091 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:12Z","lastTransitionTime":"2025-10-03T07:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.167723 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.167784 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.167797 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.167816 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.167828 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:12Z","lastTransitionTime":"2025-10-03T07:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.269807 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.269846 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.269855 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.269869 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.269877 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:12Z","lastTransitionTime":"2025-10-03T07:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.372703 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.372734 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.372743 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.372756 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.372769 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:12Z","lastTransitionTime":"2025-10-03T07:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.475490 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.475536 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.475545 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.475559 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.475570 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:12Z","lastTransitionTime":"2025-10-03T07:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.578085 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.578135 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.578149 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.578162 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.578173 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:12Z","lastTransitionTime":"2025-10-03T07:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.680563 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.680639 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.680652 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.680672 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.680685 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:12Z","lastTransitionTime":"2025-10-03T07:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.783281 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.783318 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.783325 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.783339 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.783348 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:12Z","lastTransitionTime":"2025-10-03T07:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.875981 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:12 crc kubenswrapper[4664]: E1003 07:49:12.876111 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
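pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120"

------------------------------------------------------------------------------
[annotation] The entries above repeat in a tight loop: roughly every 100 ms the kubelet re-records the same four node conditions (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady) and the Ready condition stays False because the container runtime reports NetworkReady=false, meaning no CNI network configuration exists under /etc/kubernetes/cni/net.d/ yet. Until one appears, pods that need pod networking (network-metrics-daemon-l687s, networking-console-plugin-85b44fc459-gdk6g, network-check-source-55646444c4-trplf, network-check-target-xd92c) cannot get a sandbox and are skipped on every sync. Below is a minimal Go sketch of the directory check implied by the error message; it is not kubelet or CRI-O source, and treating .conf/.conflist/.json files as network configs is the usual CNI convention, assumed here for illustration:

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
        "strings"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // path taken verbatim from the log
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Printf("NetworkReady=false: cannot read %s: %v\n", dir, err)
            return
        }
        var confs []string
        for _, e := range entries {
            // Usual CNI naming convention (an assumption of this sketch).
            switch strings.ToLower(filepath.Ext(e.Name())) {
            case ".conf", ".conflist", ".json":
                confs = append(confs, e.Name())
            }
        }
        if len(confs) == 0 {
            fmt.Println("NetworkReady=false reason:NetworkPluginNotReady (no config found)")
            return
        }
        fmt.Printf("network config present: %v\n", confs)
    }

Once the cluster network operator writes its configuration into that directory, the runtime should report NetworkReady=true and this loop stops.
------------------------------------------------------------------------------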
pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.886479 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.886506 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.886515 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.886525 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.886535 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:12Z","lastTransitionTime":"2025-10-03T07:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.989820 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.989858 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.989867 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.989881 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:12 crc kubenswrapper[4664]: I1003 07:49:12.989891 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:12Z","lastTransitionTime":"2025-10-03T07:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.092169 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.092206 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.092215 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.092229 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.092239 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:13Z","lastTransitionTime":"2025-10-03T07:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.195024 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.195089 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.195110 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.195133 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.195150 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:13Z","lastTransitionTime":"2025-10-03T07:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.298208 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.298268 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.298284 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.298309 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.298325 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:13Z","lastTransitionTime":"2025-10-03T07:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.405134 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.405177 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.405188 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.405203 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.405214 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:13Z","lastTransitionTime":"2025-10-03T07:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.508025 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.508089 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.508105 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.508128 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.508145 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:13Z","lastTransitionTime":"2025-10-03T07:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.609913 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.609962 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.609972 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.609990 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.609999 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:13Z","lastTransitionTime":"2025-10-03T07:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.712485 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.712534 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.712541 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.712554 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.712577 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:13Z","lastTransitionTime":"2025-10-03T07:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.815791 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.815841 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.815859 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.815880 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.815895 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:13Z","lastTransitionTime":"2025-10-03T07:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.875695 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.875755 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:13 crc kubenswrapper[4664]: E1003 07:49:13.875807 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:13 crc kubenswrapper[4664]: E1003 07:49:13.875926 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.875960 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:13 crc kubenswrapper[4664]: E1003 07:49:13.876121 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.917733 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.917794 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.917803 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.917820 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:13 crc kubenswrapper[4664]: I1003 07:49:13.917837 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:13Z","lastTransitionTime":"2025-10-03T07:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.019846 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.019898 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.019906 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.019920 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.019929 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:14Z","lastTransitionTime":"2025-10-03T07:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.123412 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.123459 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.123474 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.123489 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.123498 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:14Z","lastTransitionTime":"2025-10-03T07:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.225806 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.225857 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.225865 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.225879 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.225887 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:14Z","lastTransitionTime":"2025-10-03T07:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.327971 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.328030 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.328070 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.328092 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.328109 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:14Z","lastTransitionTime":"2025-10-03T07:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.430170 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.430226 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.430244 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.430263 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.430274 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:14Z","lastTransitionTime":"2025-10-03T07:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.532347 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.532393 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.532405 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.532421 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.532433 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:14Z","lastTransitionTime":"2025-10-03T07:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.634634 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.634674 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.634683 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.634697 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.634706 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:14Z","lastTransitionTime":"2025-10-03T07:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.736822 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.736861 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.736875 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.736898 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.736908 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:14Z","lastTransitionTime":"2025-10-03T07:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.839449 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.839500 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.839512 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.839525 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.839533 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:14Z","lastTransitionTime":"2025-10-03T07:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.875420 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:14 crc kubenswrapper[4664]: E1003 07:49:14.875567 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.941784 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.941826 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.941837 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.941853 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:14 crc kubenswrapper[4664]: I1003 07:49:14.941864 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:14Z","lastTransitionTime":"2025-10-03T07:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.044638 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.044684 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.044696 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.044715 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.044726 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:15Z","lastTransitionTime":"2025-10-03T07:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.147565 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.147663 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.147680 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.147706 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.147717 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:15Z","lastTransitionTime":"2025-10-03T07:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.250109 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.250154 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.250164 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.250180 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.250193 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:15Z","lastTransitionTime":"2025-10-03T07:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.352685 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.352719 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.352727 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.352740 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.352748 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:15Z","lastTransitionTime":"2025-10-03T07:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.431515 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs\") pod \"network-metrics-daemon-l687s\" (UID: \"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\") " pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:15 crc kubenswrapper[4664]: E1003 07:49:15.431692 4664 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:49:15 crc kubenswrapper[4664]: E1003 07:49:15.431743 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs podName:7f2800e0-b66e-4ab2-ad4f-37c5ffe60120 nodeName:}" failed. No retries permitted until 2025-10-03 07:49:31.431729118 +0000 UTC m=+72.252919608 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs") pod "network-metrics-daemon-l687s" (UID: "7f2800e0-b66e-4ab2-ad4f-37c5ffe60120") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.454957 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.455169 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.455234 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.455324 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.455385 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:15Z","lastTransitionTime":"2025-10-03T07:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
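Has your network provider started?"}

------------------------------------------------------------------------------
[annotation] The MountVolume.SetUp failure above is a side effect of the same startup state: the kubelet's secret manager has not yet registered the secret openshift-multus/metrics-daemon-secret, so the metrics-certs volume for network-metrics-daemon-l687s cannot be set up, and nestedpendingoperations re-queues the mount with exponential backoff. Here durationBeforeRetry is 16s and the retry is stamped m=+72.25, about 72 seconds after kubelet start. A 16s delay is consistent with a small initial delay that doubles on each failure; the Go sketch below illustrates only that doubling-with-cap pattern, and the 500ms initial delay and 2m2s cap are assumptions for illustration, not values read from kubelet source:

    package main

    import (
        "fmt"
        "time"
    )

    // durationBeforeRetry illustrates an exponential backoff like the one in
    // the log ("durationBeforeRetry 16s"). Initial delay and cap are assumed.
    func durationBeforeRetry(failures int) time.Duration {
        delay := 500 * time.Millisecond            // assumed, not kubelet's constant
        maxDelay := 2*time.Minute + 2*time.Second // assumed, not kubelet's constant
        for i := 0; i < failures; i++ {
            delay *= 2
            if delay > maxDelay {
                return maxDelay
            }
        }
        return delay
    }

    func main() {
        for n := 0; n <= 7; n++ {
            fmt.Printf("failure %d -> retry in %v\n", n, durationBeforeRetry(n))
        }
        // failure 5 -> retry in 16s, matching the delay logged above.
    }
------------------------------------------------------------------------------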
Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.557932 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.557977 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.557988 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.558007 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.558020 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:15Z","lastTransitionTime":"2025-10-03T07:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.660147 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.660424 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.660488 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.660556 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.660629 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:15Z","lastTransitionTime":"2025-10-03T07:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.762474 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.762505 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.762513 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.762526 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.762535 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:15Z","lastTransitionTime":"2025-10-03T07:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.864981 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.865044 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.865058 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.865076 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.865091 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:15Z","lastTransitionTime":"2025-10-03T07:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.875514 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.875545 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:15 crc kubenswrapper[4664]: E1003 07:49:15.875670 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.875684 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:15 crc kubenswrapper[4664]: E1003 07:49:15.876235 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:15 crc kubenswrapper[4664]: E1003 07:49:15.876815 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
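pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"

------------------------------------------------------------------------------
[annotation] Note the cadence: each of the three sandbox-less pods is retried about every two seconds (networking-console-plugin-85b44fc459-gdk6g, for example, at 07:49:13.875755 and again at 07:49:15.875514), the pod workers' periodic re-sync after each "Error syncing pod, skipping". A small Go check using two timestamps copied from this log:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Timestamps copied from the "No sandbox for pod can be found"
        // entries for networking-console-plugin-85b44fc459-gdk6g above.
        const layout = "15:04:05.000000"
        first, err := time.Parse(layout, "07:49:13.875755")
        if err != nil {
            panic(err)
        }
        second, err := time.Parse(layout, "07:49:15.875514")
        if err != nil {
            panic(err)
        }
        gap := second.Sub(first)
        fmt.Println(gap, "~", gap.Round(time.Second)) // 1.999759s ~ 2s
    }
------------------------------------------------------------------------------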
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.967000 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.967046 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.967056 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.967071 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:15 crc kubenswrapper[4664]: I1003 07:49:15.967082 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:15Z","lastTransitionTime":"2025-10-03T07:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.068878 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.068921 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.068933 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.068950 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.068959 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:16Z","lastTransitionTime":"2025-10-03T07:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.171069 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.171101 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.171111 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.171123 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.171132 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:16Z","lastTransitionTime":"2025-10-03T07:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.224302 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.224338 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.224349 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.224364 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.224373 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:16Z","lastTransitionTime":"2025-10-03T07:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:16 crc kubenswrapper[4664]: E1003 07:49:16.234944 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:16Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.237761 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.237797 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.237808 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.237823 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.237834 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:16Z","lastTransitionTime":"2025-10-03T07:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:16 crc kubenswrapper[4664]: E1003 07:49:16.249060 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:16Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.252279 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.252329 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
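The retry above fails for the same reason as every attempt in this excerpt: not the CNI problem itself, but an expired serving certificate on the node.network-node-identity webhook at 127.0.0.1:9743, valid only until 2025-08-24T17:21:41Z while the node clock reads 2025-10-03. A minimal sketch for pulling that validity gap out of a saved journal excerpt; the filename kubelet.log and the script itself are assumptions for illustration, not OpenShift tooling:

    # Extract "current time X is after Y" from the kubelet TLS error and
    # report how long the webhook certificate had been expired.
    import re
    from datetime import datetime, timezone

    PAT = re.compile(r"current time ([0-9T:Z-]+) is after ([0-9T:Z-]+)")

    def parse(ts: str) -> datetime:
        return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)

    with open("kubelet.log") as f:  # assumed journal dump of this unit
        for line in f:
            m = PAT.search(line)
            if m:
                now, not_after = parse(m.group(1)), parse(m.group(2))
                print(f"webhook certificate expired {now - not_after} before this retry")
                break

On the timestamps in this excerpt the gap is roughly 39.5 days, which is why every retry below fails identically until the certificate is rotated.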
event="NodeHasNoDiskPressure" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.252338 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.252355 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.252365 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:16Z","lastTransitionTime":"2025-10-03T07:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:16 crc kubenswrapper[4664]: E1003 07:49:16.263317 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:16Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.266339 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.266369 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
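Independently of the webhook failure, the Ready condition in every setters.go entry explains why the node stays NotReady: the runtime reports NetworkReady=false because /etc/kubernetes/cni/net.d/ holds no CNI configuration, a file normally written once the cluster network provider starts. A small triage sketch under that assumption; it only mirrors the symptom named in the log, it is not kubelet's own readiness logic:

    # Does the CNI config directory named in the log hold any network
    # config? Extensions follow the CNI convention (.conf/.conflist/.json).
    import os

    CNI_DIR = "/etc/kubernetes/cni/net.d"

    try:
        configs = sorted(f for f in os.listdir(CNI_DIR)
                         if f.endswith((".conf", ".conflist", ".json")))
    except FileNotFoundError:
        configs = []

    if configs:
        print("CNI config present:", ", ".join(configs))
    else:
        print(f"no CNI configuration file in {CNI_DIR} -- network provider not started")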
event="NodeHasNoDiskPressure" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.266377 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.266391 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.266400 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:16Z","lastTransitionTime":"2025-10-03T07:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:16 crc kubenswrapper[4664]: E1003 07:49:16.279734 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:16Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.283398 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.283434 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
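Each failed patch above counts against a fixed budget: upstream kubelet retries a node status update nodeStatusUpdateRetry (5) times per sync loop, and the fifth consecutive failure below is followed by "Unable to update node status" with "update node status exceeds retry count". A sketch for tallying those cycles in a saved excerpt (filename again an assumption):

    # Count "will retry" failures per update cycle; each cycle ends with
    # "exceeds retry count" once the kubelet's retry budget is spent.
    retries = 0
    with open("kubelet.log") as f:  # assumed journal dump of this unit
        for line in f:
            if "Error updating node status, will retry" in line:
                retries += 1
            elif "update node status exceeds retry count" in line:
                print(f"status update cycle gave up after {retries} attempts")
                retries = 0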
event="NodeHasNoDiskPressure" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.283444 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.283458 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.283468 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:16Z","lastTransitionTime":"2025-10-03T07:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:16 crc kubenswrapper[4664]: E1003 07:49:16.295110 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:16Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:16 crc kubenswrapper[4664]: E1003 07:49:16.295342 4664 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.296839 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.296876 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.296887 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.296904 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.296917 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:16Z","lastTransitionTime":"2025-10-03T07:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.406512 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.406556 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.406565 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.406579 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.406588 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:16Z","lastTransitionTime":"2025-10-03T07:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.508712 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.508751 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.508759 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.508775 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.508785 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:16Z","lastTransitionTime":"2025-10-03T07:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.611075 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.611122 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.611130 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.611144 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.611153 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:16Z","lastTransitionTime":"2025-10-03T07:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.714192 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.714236 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.714245 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.714258 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.714267 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:16Z","lastTransitionTime":"2025-10-03T07:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.816642 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.816678 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.816687 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.816701 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.816709 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:16Z","lastTransitionTime":"2025-10-03T07:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.875853 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:16 crc kubenswrapper[4664]: E1003 07:49:16.875989 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.918885 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.918927 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.918941 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.918964 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:16 crc kubenswrapper[4664]: I1003 07:49:16.918979 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:16Z","lastTransitionTime":"2025-10-03T07:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.021568 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.021644 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.021658 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.021674 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.021684 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:17Z","lastTransitionTime":"2025-10-03T07:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.080655 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.095669 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.123974 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.124021 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.124032 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.124056 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.124069 4664 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:17Z","lastTransitionTime":"2025-10-03T07:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.126203 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\
":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:57Z\\\",\\\"message\\\":\\\"ode-ca-9z9q9\\\\nI1003 07:48:57.083722 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083725 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 07:48:57.083739 6091 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083744 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1003 07:48:57.083746 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1003 07:48:57.083749 6091 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.143880 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.157754 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 
07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.170058 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.183194 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.194595 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.204193 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.214818 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.225116 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b74b4b6-2022-4308-8071-0f972dd1d922\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.225916 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.225954 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.225966 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.225983 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.225996 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:17Z","lastTransitionTime":"2025-10-03T07:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.236090 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.245112 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.255504 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.266037 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.274811 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.283802 4664 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.294494 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:17Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.328255 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.328384 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.328450 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.328519 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.328584 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:17Z","lastTransitionTime":"2025-10-03T07:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.431044 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.431111 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.431124 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.431141 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.431150 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:17Z","lastTransitionTime":"2025-10-03T07:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.533577 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.533842 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.533953 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.534041 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.534129 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:17Z","lastTransitionTime":"2025-10-03T07:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.636243 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.636491 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.636506 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.636524 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.636537 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:17Z","lastTransitionTime":"2025-10-03T07:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.739048 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.739078 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.739086 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.739098 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.739107 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:17Z","lastTransitionTime":"2025-10-03T07:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.841089 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.841129 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.841139 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.841157 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.841169 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:17Z","lastTransitionTime":"2025-10-03T07:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.875458 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.875496 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.875469 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:17 crc kubenswrapper[4664]: E1003 07:49:17.875695 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:17 crc kubenswrapper[4664]: E1003 07:49:17.875746 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:17 crc kubenswrapper[4664]: E1003 07:49:17.875810 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.876384 4664 scope.go:117] "RemoveContainer" containerID="4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.944153 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.944197 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.944207 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.944223 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:17 crc kubenswrapper[4664]: I1003 07:49:17.944236 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:17Z","lastTransitionTime":"2025-10-03T07:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.047031 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.047373 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.047388 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.047404 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.047414 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:18Z","lastTransitionTime":"2025-10-03T07:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.150250 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.150278 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.150288 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.150306 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.150317 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:18Z","lastTransitionTime":"2025-10-03T07:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.225178 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/1.log" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.227748 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerStarted","Data":"af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574"} Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.228749 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.245999 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.254419 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.254451 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.254459 4664 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.254473 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.254488 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:18Z","lastTransitionTime":"2025-10-03T07:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.265365 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.287753 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.298214 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.309522 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z"
Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.324355 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b74b4b6-2022-4308-8071-0f972dd1d922\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z"
Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.336380 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.355778 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.357141 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.357182 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.357196 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.357215 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.357225 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:18Z","lastTransitionTime":"2025-10-03T07:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.368688 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.379799 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.389750 4664 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.402679 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.413243 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:18 crc 
kubenswrapper[4664]: I1003 07:49:18.427967 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.448203 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:57Z\\\",\\\"message\\\":\\\"ode-ca-9z9q9\\\\nI1003 07:48:57.083722 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083725 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 07:48:57.083739 6091 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083744 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1003 07:48:57.083746 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1003 07:48:57.083749 6091 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z"
Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.459106 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.459143 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.459152 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.459166 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.459176 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:18Z","lastTransitionTime":"2025-10-03T07:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.462104 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.472449 4664 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T07:49:18Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.561242 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.561291 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.561302 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.561321 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.561338 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:18Z","lastTransitionTime":"2025-10-03T07:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.663562 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.663600 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.663625 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.663641 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.663656 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:18Z","lastTransitionTime":"2025-10-03T07:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.766013 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.766059 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.766076 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.766091 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.766102 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:18Z","lastTransitionTime":"2025-10-03T07:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.868138 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.868174 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.868185 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.868200 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.868211 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:18Z","lastTransitionTime":"2025-10-03T07:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.875668 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:18 crc kubenswrapper[4664]: E1003 07:49:18.875796 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.970134 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.970170 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.970183 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.970199 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:18 crc kubenswrapper[4664]: I1003 07:49:18.970210 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:18Z","lastTransitionTime":"2025-10-03T07:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.072675 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.072719 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.072729 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.072742 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.072750 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:19Z","lastTransitionTime":"2025-10-03T07:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.174928 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.174981 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.174998 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.175022 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.175035 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:19Z","lastTransitionTime":"2025-10-03T07:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.231163 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/2.log" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.231681 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/1.log" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.233819 4664 generic.go:334] "Generic (PLEG): container finished" podID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerID="af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574" exitCode=1 Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.233850 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerDied","Data":"af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574"} Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.234042 4664 scope.go:117] "RemoveContainer" containerID="4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.234440 4664 scope.go:117] "RemoveContainer" containerID="af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574" Oct 03 07:49:19 crc kubenswrapper[4664]: E1003 07:49:19.234596 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.245886 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.256832 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.266679 4664 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.276886 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.276924 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.276935 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.276951 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.276963 4664 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:19Z","lastTransitionTime":"2025-10-03T07:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.278128 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.290871 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 
1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.308327 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:57Z\\\",\\\"message\\\":\\\"ode-ca-9z9q9\\\\nI1003 07:48:57.083722 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083725 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 07:48:57.083739 6091 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083744 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1003 07:48:57.083746 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1003 07:48:57.083749 6091 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:18Z\\\",\\\"message\\\":\\\"Pod openshift-ovn-kubernetes/ovnkube-node-2jpvm\\\\nI1003 07:49:18.633255 6386 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1003 07:49:18.633257 6386 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq\\\\nI1003 07:49:18.633264 6386 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1003 07:49:18.633264 6386 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h865c\\\\nI1003 07:49:18.633271 6386 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1003 07:49:18.633218 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF1003 07:49:18.633264 6386 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.323976 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.334035 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 
07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.344279 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.356716 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.371316 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.379233 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.379392 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.379505 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.379584 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.379685 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:19Z","lastTransitionTime":"2025-10-03T07:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.383050 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.393737 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.402664 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b74b4b6-2022-4308-8071-0f972dd1d922\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.412576 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.422098 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.431771 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.481872 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.481908 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.481916 4664 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.481928 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.481938 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:19Z","lastTransitionTime":"2025-10-03T07:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.584328 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.584393 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.584405 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.584420 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.584428 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:19Z","lastTransitionTime":"2025-10-03T07:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.687048 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.687748 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.687805 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.687838 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.687866 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:19Z","lastTransitionTime":"2025-10-03T07:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.790188 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.790226 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.790235 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.790249 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.790258 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:19Z","lastTransitionTime":"2025-10-03T07:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.875184 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:19 crc kubenswrapper[4664]: E1003 07:49:19.875311 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.875189 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.875194 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:19 crc kubenswrapper[4664]: E1003 07:49:19.875557 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:19 crc kubenswrapper[4664]: E1003 07:49:19.875682 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.887629 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.892198 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.892242 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.892255 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.892272 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.892282 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:19Z","lastTransitionTime":"2025-10-03T07:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.900403 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.911025 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 
07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.922373 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.938396 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.961478 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ce0e6687a56096c1c89207f11b667732301910fdefad7e84fabca337fb456c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:48:57Z\\\",\\\"message\\\":\\\"ode-ca-9z9q9\\\\nI1003 07:48:57.083722 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083725 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 07:48:57.083739 6091 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 07:48:57.083744 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1003 07:48:57.083746 6091 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1003 07:48:57.083749 6091 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:18Z\\\",\\\"message\\\":\\\"Pod openshift-ovn-kubernetes/ovnkube-node-2jpvm\\\\nI1003 07:49:18.633255 6386 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1003 07:49:18.633257 6386 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq\\\\nI1003 07:49:18.633264 6386 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1003 07:49:18.633264 6386 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h865c\\\\nI1003 07:49:18.633271 6386 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1003 07:49:18.633218 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF1003 07:49:18.633264 6386 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.977454 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.990335 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:19Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.994046 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.994090 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.994103 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.994120 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:19 crc kubenswrapper[4664]: I1003 07:49:19.994130 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:19Z","lastTransitionTime":"2025-10-03T07:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.003196 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.015132 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.027549 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.037439 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.048881 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.059238 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b74b4b6-2022-4308-8071-0f972dd1d922\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.069662 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.079440 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.090348 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.097205 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.097248 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.097258 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.097273 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.097285 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:20Z","lastTransitionTime":"2025-10-03T07:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.200194 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.200240 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.200256 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.200273 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.200283 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:20Z","lastTransitionTime":"2025-10-03T07:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.239550 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/2.log" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.242826 4664 scope.go:117] "RemoveContainer" containerID="af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574" Oct 03 07:49:20 crc kubenswrapper[4664]: E1003 07:49:20.242957 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.258354 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.273965 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.285751 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.297877 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.302252 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.302283 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.302292 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.302305 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.302318 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:20Z","lastTransitionTime":"2025-10-03T07:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.309354 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b74b4b6-2022-4308-8071-0f972dd1d922\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.320635 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.332350 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.347256 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.362096 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.376273 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.388806 4664 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.402862 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.404175 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.404212 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.404221 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.404236 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.404247 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:20Z","lastTransitionTime":"2025-10-03T07:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.420131 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.439691 4664 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:18Z\\\",\\\"message\\\":\\\"Pod openshift-ovn-kubernetes/ovnkube-node-2jpvm\\\\nI1003 07:49:18.633255 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1003 07:49:18.633257 6386 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq\\\\nI1003 07:49:18.633264 6386 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1003 07:49:18.633264 6386 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h865c\\\\nI1003 07:49:18.633271 6386 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1003 07:49:18.633218 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF1003 07:49:18.633264 6386 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:49:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.453582 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.466386 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.477053 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:20Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.507160 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.507200 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.507211 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.507225 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.507235 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:20Z","lastTransitionTime":"2025-10-03T07:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.610004 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.610048 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.610057 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.610072 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.610082 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:20Z","lastTransitionTime":"2025-10-03T07:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.712061 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.712091 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.712100 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.712112 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.712120 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:20Z","lastTransitionTime":"2025-10-03T07:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.814733 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.814774 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.814784 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.814798 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.814808 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:20Z","lastTransitionTime":"2025-10-03T07:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.875353 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:20 crc kubenswrapper[4664]: E1003 07:49:20.875485 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.917085 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.917128 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.917139 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.917152 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:20 crc kubenswrapper[4664]: I1003 07:49:20.917160 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:20Z","lastTransitionTime":"2025-10-03T07:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.019595 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.019651 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.019660 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.019673 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.019682 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:21Z","lastTransitionTime":"2025-10-03T07:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.121860 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.121930 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.121970 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.121986 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.122001 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:21Z","lastTransitionTime":"2025-10-03T07:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.223911 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.223952 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.223962 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.223977 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.223988 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:21Z","lastTransitionTime":"2025-10-03T07:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.326460 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.326493 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.326500 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.326513 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.326522 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:21Z","lastTransitionTime":"2025-10-03T07:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.428884 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.428928 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.428937 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.428951 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.428962 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:21Z","lastTransitionTime":"2025-10-03T07:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.531778 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.531823 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.531836 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.531852 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.531864 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:21Z","lastTransitionTime":"2025-10-03T07:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.634020 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.634066 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.634078 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.634092 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.634101 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:21Z","lastTransitionTime":"2025-10-03T07:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.736249 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.736298 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.736309 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.736325 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.736333 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:21Z","lastTransitionTime":"2025-10-03T07:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.839229 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.839267 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.839275 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.839288 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.839298 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:21Z","lastTransitionTime":"2025-10-03T07:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.875911 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.875955 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.875955 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:21 crc kubenswrapper[4664]: E1003 07:49:21.876029 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:21 crc kubenswrapper[4664]: E1003 07:49:21.876194 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:21 crc kubenswrapper[4664]: E1003 07:49:21.876251 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.941959 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.941999 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.942007 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.942020 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:21 crc kubenswrapper[4664]: I1003 07:49:21.942029 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:21Z","lastTransitionTime":"2025-10-03T07:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.045310 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.045373 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.045387 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.045406 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.045419 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:22Z","lastTransitionTime":"2025-10-03T07:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.149209 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.149260 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.149272 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.149288 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.149300 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:22Z","lastTransitionTime":"2025-10-03T07:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.251435 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.251483 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.251492 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.251507 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.251516 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:22Z","lastTransitionTime":"2025-10-03T07:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.354637 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.354701 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.354718 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.354741 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.354757 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:22Z","lastTransitionTime":"2025-10-03T07:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.457337 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.457392 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.457402 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.457417 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.457426 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:22Z","lastTransitionTime":"2025-10-03T07:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.560211 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.560300 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.560322 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.560355 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.560375 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:22Z","lastTransitionTime":"2025-10-03T07:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.662922 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.662956 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.662966 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.662982 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.663300 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:22Z","lastTransitionTime":"2025-10-03T07:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.764953 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.764996 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.765008 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.765023 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.765033 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:22Z","lastTransitionTime":"2025-10-03T07:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.867704 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.867756 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.867770 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.867784 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.868090 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:22Z","lastTransitionTime":"2025-10-03T07:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.875903 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:22 crc kubenswrapper[4664]: E1003 07:49:22.875994 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.970138 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.970171 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.970186 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.970201 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:22 crc kubenswrapper[4664]: I1003 07:49:22.970211 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:22Z","lastTransitionTime":"2025-10-03T07:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.072510 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.072554 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.072564 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.072578 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.072587 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:23Z","lastTransitionTime":"2025-10-03T07:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.175277 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.175316 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.175325 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.175340 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.175351 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:23Z","lastTransitionTime":"2025-10-03T07:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.277896 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.277940 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.277951 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.277967 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.277978 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:23Z","lastTransitionTime":"2025-10-03T07:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.383485 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.383767 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.383782 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.383797 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.383809 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:23Z","lastTransitionTime":"2025-10-03T07:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.485516 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.485548 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.485558 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.485572 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.485582 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:23Z","lastTransitionTime":"2025-10-03T07:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.587736 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.587844 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.587866 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.587898 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.587918 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:23Z","lastTransitionTime":"2025-10-03T07:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.690620 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.690663 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.690672 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.690688 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.690698 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:23Z","lastTransitionTime":"2025-10-03T07:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.793388 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.793435 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.793447 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.793462 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.793474 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:23Z","lastTransitionTime":"2025-10-03T07:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.875724 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.875755 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.875800 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:23 crc kubenswrapper[4664]: E1003 07:49:23.875880 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:23 crc kubenswrapper[4664]: E1003 07:49:23.875993 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:23 crc kubenswrapper[4664]: E1003 07:49:23.876132 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.896093 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.896136 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.896148 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.896162 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.896170 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:23Z","lastTransitionTime":"2025-10-03T07:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.998511 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.998558 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.998574 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.998594 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:23 crc kubenswrapper[4664]: I1003 07:49:23.998638 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:23Z","lastTransitionTime":"2025-10-03T07:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.100699 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.100739 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.100749 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.100763 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.100772 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:24Z","lastTransitionTime":"2025-10-03T07:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.204071 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.204113 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.204122 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.204138 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.204150 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:24Z","lastTransitionTime":"2025-10-03T07:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.307707 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.307763 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.307779 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.307801 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.307814 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:24Z","lastTransitionTime":"2025-10-03T07:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.411084 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.411133 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.411147 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.411161 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.411171 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:24Z","lastTransitionTime":"2025-10-03T07:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.514595 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.514698 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.514715 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.514742 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.514762 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:24Z","lastTransitionTime":"2025-10-03T07:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.617253 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.617301 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.617312 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.617329 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.617341 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:24Z","lastTransitionTime":"2025-10-03T07:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.720138 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.720184 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.720195 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.720216 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.720229 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:24Z","lastTransitionTime":"2025-10-03T07:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.823904 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.823998 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.824027 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.824062 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.824090 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:24Z","lastTransitionTime":"2025-10-03T07:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.876189 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:24 crc kubenswrapper[4664]: E1003 07:49:24.876700 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.926950 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.927229 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.927309 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.927405 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:24 crc kubenswrapper[4664]: I1003 07:49:24.927502 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:24Z","lastTransitionTime":"2025-10-03T07:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.030036 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.030374 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.030454 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.030537 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.030656 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:25Z","lastTransitionTime":"2025-10-03T07:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.132967 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.133004 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.133014 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.133030 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.133042 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:25Z","lastTransitionTime":"2025-10-03T07:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.235987 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.236053 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.236070 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.236099 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.236116 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:25Z","lastTransitionTime":"2025-10-03T07:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.338802 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.338839 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.338850 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.338867 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.338879 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:25Z","lastTransitionTime":"2025-10-03T07:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.440707 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.440746 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.440757 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.440773 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.440787 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:25Z","lastTransitionTime":"2025-10-03T07:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.568824 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.568861 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.568869 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.568883 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.568894 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:25Z","lastTransitionTime":"2025-10-03T07:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.670882 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.670928 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.670939 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.670955 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.670968 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:25Z","lastTransitionTime":"2025-10-03T07:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.774183 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.774249 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.774262 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.774281 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.774294 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:25Z","lastTransitionTime":"2025-10-03T07:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.875560 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.875600 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:25 crc kubenswrapper[4664]: E1003 07:49:25.875734 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:25 crc kubenswrapper[4664]: E1003 07:49:25.875882 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.875952 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:25 crc kubenswrapper[4664]: E1003 07:49:25.876032 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.877142 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.877176 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.877188 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.877205 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.877217 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:25Z","lastTransitionTime":"2025-10-03T07:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.979859 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.979905 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.979916 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.979933 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:25 crc kubenswrapper[4664]: I1003 07:49:25.979945 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:25Z","lastTransitionTime":"2025-10-03T07:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.082291 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.082363 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.082379 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.082396 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.082406 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:26Z","lastTransitionTime":"2025-10-03T07:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.184776 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.184819 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.184830 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.184846 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.184857 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:26Z","lastTransitionTime":"2025-10-03T07:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.286889 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.286962 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.286977 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.286993 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.287003 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:26Z","lastTransitionTime":"2025-10-03T07:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.390439 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.390499 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.390510 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.390530 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.390540 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:26Z","lastTransitionTime":"2025-10-03T07:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.468941 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.468999 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.469011 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.469026 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.469038 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:26Z","lastTransitionTime":"2025-10-03T07:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:26 crc kubenswrapper[4664]: E1003 07:49:26.482684 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:26Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.487313 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.487381 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.487397 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.487417 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.487448 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:26Z","lastTransitionTime":"2025-10-03T07:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:26 crc kubenswrapper[4664]: E1003 07:49:26.499978 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:26Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.504910 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.504949 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
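The "Error updating node status, will retry" failure above is not caused by the missing CNI config at all: the status patch dies earlier, when the apiserver calls the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 and TLS verification fails because the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-10-03. The sketch below is not part of the log; it is one way, using only the Python standard library, to fetch the certificate that endpoint actually presents so its validity window can be inspected offline (for example with `openssl x509 -noout -dates`).

#!/usr/bin/env python3
# Diagnostic sketch (not from this log): fetch the serving certificate from
# the webhook endpoint named in the error, skipping verification on purpose,
# since verification is exactly what fails in the kubelet.
import socket
import ssl

HOST, PORT = "127.0.0.1", 9743  # endpoint from the failed webhook Post

ctx = ssl.create_default_context()
ctx.check_hostname = False      # the cert is expired; we want it anyway
ctx.verify_mode = ssl.CERT_NONE

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)  # DER bytes of the leaf cert

print(ssl.DER_cert_to_PEM_cert(der), end="")  # PEM for offline inspection

Until that certificate is rotated, every node-status patch will keep failing the same way regardless of when the CNI config appears, which is why the identical error repeats below with only the retry timestamp changing.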
event="NodeHasNoDiskPressure" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.504959 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.504995 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.505009 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:26Z","lastTransitionTime":"2025-10-03T07:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:26 crc kubenswrapper[4664]: E1003 07:49:26.520489 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:26Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.525140 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.525189 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.525202 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.525221 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.525231 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:26Z","lastTransitionTime":"2025-10-03T07:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:26 crc kubenswrapper[4664]: E1003 07:49:26.540016 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:26Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.545099 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.545143 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.545153 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.545172 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.545183 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:26Z","lastTransitionTime":"2025-10-03T07:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:26 crc kubenswrapper[4664]: E1003 07:49:26.557374 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:26Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:26 crc kubenswrapper[4664]: E1003 07:49:26.557500 4664 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.559488 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.559530 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.559545 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.559562 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.559574 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:26Z","lastTransitionTime":"2025-10-03T07:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.662086 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.662142 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.662151 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.662166 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.662176 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:26Z","lastTransitionTime":"2025-10-03T07:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.765177 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.765238 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.765252 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.765272 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.765283 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:26Z","lastTransitionTime":"2025-10-03T07:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.867587 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.867661 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.867670 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.867696 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.867704 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:26Z","lastTransitionTime":"2025-10-03T07:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.875996 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:26 crc kubenswrapper[4664]: E1003 07:49:26.876197 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.970359 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.970401 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.970410 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.970425 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:26 crc kubenswrapper[4664]: I1003 07:49:26.970434 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:26Z","lastTransitionTime":"2025-10-03T07:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.072596 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.072653 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.072663 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.072679 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.072689 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:27Z","lastTransitionTime":"2025-10-03T07:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.175076 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.175112 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.175123 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.175139 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.175150 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:27Z","lastTransitionTime":"2025-10-03T07:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.277471 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.277507 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.277517 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.277533 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.277543 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:27Z","lastTransitionTime":"2025-10-03T07:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.380014 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.380051 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.380060 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.380074 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.380083 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:27Z","lastTransitionTime":"2025-10-03T07:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.482000 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.482084 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.482097 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.482116 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.482130 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:27Z","lastTransitionTime":"2025-10-03T07:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.584144 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.584190 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.584200 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.584215 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.584226 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:27Z","lastTransitionTime":"2025-10-03T07:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.686970 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.687025 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.687042 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.687061 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.687071 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:27Z","lastTransitionTime":"2025-10-03T07:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.789648 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.789694 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.789703 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.789719 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.789730 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:27Z","lastTransitionTime":"2025-10-03T07:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.875587 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.875599 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.875618 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:27 crc kubenswrapper[4664]: E1003 07:49:27.875864 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:27 crc kubenswrapper[4664]: E1003 07:49:27.875905 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:27 crc kubenswrapper[4664]: E1003 07:49:27.875733 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.892131 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.892167 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.892179 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.892194 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.892206 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:27Z","lastTransitionTime":"2025-10-03T07:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.996888 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.996962 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.996977 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.997000 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:27 crc kubenswrapper[4664]: I1003 07:49:27.997010 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:27Z","lastTransitionTime":"2025-10-03T07:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.099839 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.099902 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.099913 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.099929 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.099940 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:28Z","lastTransitionTime":"2025-10-03T07:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.203031 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.203124 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.203138 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.203158 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.203170 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:28Z","lastTransitionTime":"2025-10-03T07:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.305940 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.305986 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.305998 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.306013 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.306024 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:28Z","lastTransitionTime":"2025-10-03T07:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.408470 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.408508 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.408517 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.408529 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.408540 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:28Z","lastTransitionTime":"2025-10-03T07:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.511941 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.512075 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.512094 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.512120 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.512177 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:28Z","lastTransitionTime":"2025-10-03T07:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.615757 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.615823 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.615836 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.615860 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.615874 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:28Z","lastTransitionTime":"2025-10-03T07:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.719083 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.719143 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.719155 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.719174 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.719186 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:28Z","lastTransitionTime":"2025-10-03T07:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.821698 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.821739 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.821751 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.821766 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.821778 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:28Z","lastTransitionTime":"2025-10-03T07:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.875393 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:28 crc kubenswrapper[4664]: E1003 07:49:28.875537 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.924350 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.924406 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.924419 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.924438 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:28 crc kubenswrapper[4664]: I1003 07:49:28.924453 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:28Z","lastTransitionTime":"2025-10-03T07:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.026878 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.026928 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.026940 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.026956 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.026967 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:29Z","lastTransitionTime":"2025-10-03T07:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.129664 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.129705 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.129716 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.129730 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.129742 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:29Z","lastTransitionTime":"2025-10-03T07:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.232694 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.233095 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.233237 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.233416 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.233534 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:29Z","lastTransitionTime":"2025-10-03T07:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.337349 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.337398 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.337408 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.337429 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.337439 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:29Z","lastTransitionTime":"2025-10-03T07:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.440096 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.440164 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.440178 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.440196 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.440209 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:29Z","lastTransitionTime":"2025-10-03T07:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.542848 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.542914 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.542930 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.542961 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.542983 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:29Z","lastTransitionTime":"2025-10-03T07:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.645314 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.645361 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.645374 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.645396 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.645412 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:29Z","lastTransitionTime":"2025-10-03T07:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.748665 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.748743 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.748759 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.748782 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.748798 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:29Z","lastTransitionTime":"2025-10-03T07:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.852168 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.852243 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.852255 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.852271 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.852283 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:29Z","lastTransitionTime":"2025-10-03T07:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.875435 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:29 crc kubenswrapper[4664]: E1003 07:49:29.875689 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.875487 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:29 crc kubenswrapper[4664]: E1003 07:49:29.876112 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.876383 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:29 crc kubenswrapper[4664]: E1003 07:49:29.876781 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.891103 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:29Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.903426 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:29Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.915224 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:29Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.933008 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:29Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.948196 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"ima
ge\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:29Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.954304 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.954351 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.954369 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.954390 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.954404 4664 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:29Z","lastTransitionTime":"2025-10-03T07:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.973080 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\
":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:18Z\\\",\\\"message\\\":\\\"Pod openshift-ovn-kubernetes/ovnkube-node-2jpvm\\\\nI1003 07:49:18.633255 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1003 07:49:18.633257 6386 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq\\\\nI1003 07:49:18.633264 6386 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1003 07:49:18.633264 6386 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h865c\\\\nI1003 07:49:18.633271 6386 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1003 07:49:18.633218 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF1003 07:49:18.633264 6386 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:49:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:29Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.988857 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:29Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:29 crc kubenswrapper[4664]: I1003 07:49:29.999569 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:29Z is after 2025-08-24T17:21:41Z" Oct 03 
07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.009077 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:30Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.020923 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:30Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.032186 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:30Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.045561 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:30Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.056413 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.056447 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.056458 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.056471 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.056481 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:30Z","lastTransitionTime":"2025-10-03T07:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.060466 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b74b4b6-2022-4308-8071-0f972dd1d922\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:30Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.072593 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:30Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.085041 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:30Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.098901 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
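
[Annotation] The record that continues below shows the network-node-identity pod's own approver and webhook containers Running since 07:48:40, so the webhook process is up; only the certificate it serves is stale. A hedged diagnostic sketch that dials the logged endpoint, skips chain verification so the handshake completes despite the expired certificate, and prints the served validity window (a diagnostic, not a fix):

```go
// Hedged sketch: inspect what the webhook at the address from the log
// actually serves. InsecureSkipVerify is acceptable here only because we
// want to read the peer certificate, not trust it.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%s NotBefore=%s NotAfter=%s\n",
			cert.Subject, cert.NotBefore, cert.NotAfter)
	}
}
```
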
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:30Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.111675 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:30Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.159116 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.159152 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.159160 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.159173 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.159182 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:30Z","lastTransitionTime":"2025-10-03T07:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.261184 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.261248 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.261258 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.261272 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.261282 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:30Z","lastTransitionTime":"2025-10-03T07:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.363142 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.363190 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.363203 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.363221 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.363233 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:30Z","lastTransitionTime":"2025-10-03T07:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.465858 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.465915 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.465926 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.465940 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.465949 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:30Z","lastTransitionTime":"2025-10-03T07:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.568594 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.568666 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.568676 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.568691 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.568703 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:30Z","lastTransitionTime":"2025-10-03T07:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.671411 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.671448 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.671457 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.671470 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.671480 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:30Z","lastTransitionTime":"2025-10-03T07:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.773722 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.773766 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.773775 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.773788 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.773798 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:30Z","lastTransitionTime":"2025-10-03T07:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.875172 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:30 crc kubenswrapper[4664]: E1003 07:49:30.875308 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.876506 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.876571 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.876584 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.876639 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.876655 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:30Z","lastTransitionTime":"2025-10-03T07:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.979674 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.979758 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.979767 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.979781 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:30 crc kubenswrapper[4664]: I1003 07:49:30.979794 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:30Z","lastTransitionTime":"2025-10-03T07:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.082429 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.082470 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.082483 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.082498 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.082510 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:31Z","lastTransitionTime":"2025-10-03T07:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.184726 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.184765 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.184774 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.184789 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.184798 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:31Z","lastTransitionTime":"2025-10-03T07:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.287301 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.287351 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.287392 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.287410 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.287421 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:31Z","lastTransitionTime":"2025-10-03T07:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.389682 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.389719 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.389729 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.389777 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.389787 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:31Z","lastTransitionTime":"2025-10-03T07:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.491985 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.492025 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.492035 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.492050 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.492060 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:31Z","lastTransitionTime":"2025-10-03T07:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.492690 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs\") pod \"network-metrics-daemon-l687s\" (UID: \"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\") " pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:31 crc kubenswrapper[4664]: E1003 07:49:31.492829 4664 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:49:31 crc kubenswrapper[4664]: E1003 07:49:31.493102 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs podName:7f2800e0-b66e-4ab2-ad4f-37c5ffe60120 nodeName:}" failed. No retries permitted until 2025-10-03 07:50:03.493084517 +0000 UTC m=+104.314275007 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs") pod "network-metrics-daemon-l687s" (UID: "7f2800e0-b66e-4ab2-ad4f-37c5ffe60120") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.594517 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.594580 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.594590 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.594624 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.594635 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:31Z","lastTransitionTime":"2025-10-03T07:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.696530 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.696566 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.696577 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.696591 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.696618 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:31Z","lastTransitionTime":"2025-10-03T07:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.799081 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.799118 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.799126 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.799140 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.799149 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:31Z","lastTransitionTime":"2025-10-03T07:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.876243 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:31 crc kubenswrapper[4664]: E1003 07:49:31.876362 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.876534 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:31 crc kubenswrapper[4664]: E1003 07:49:31.876577 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.876736 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:31 crc kubenswrapper[4664]: E1003 07:49:31.876799 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.901538 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.901567 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.901590 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.901617 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:31 crc kubenswrapper[4664]: I1003 07:49:31.901629 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:31Z","lastTransitionTime":"2025-10-03T07:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.004080 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.004137 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.004153 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.004172 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.004185 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:32Z","lastTransitionTime":"2025-10-03T07:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.106431 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.106496 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.106507 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.106524 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.106537 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:32Z","lastTransitionTime":"2025-10-03T07:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.208990 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.209040 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.209055 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.209068 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.209076 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:32Z","lastTransitionTime":"2025-10-03T07:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.311419 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.311461 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.311469 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.311499 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.311508 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:32Z","lastTransitionTime":"2025-10-03T07:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.414362 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.414399 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.414409 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.414424 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.414435 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:32Z","lastTransitionTime":"2025-10-03T07:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.517194 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.517238 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.517251 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.517269 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.517282 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:32Z","lastTransitionTime":"2025-10-03T07:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.619976 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.620006 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.620014 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.620030 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.620039 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:32Z","lastTransitionTime":"2025-10-03T07:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.722786 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.722835 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.722844 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.722863 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.722874 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:32Z","lastTransitionTime":"2025-10-03T07:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.824704 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.824744 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.824757 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.824774 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.824787 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:32Z","lastTransitionTime":"2025-10-03T07:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.876045 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:32 crc kubenswrapper[4664]: E1003 07:49:32.876179 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.926618 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.926649 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.926657 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.926671 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:32 crc kubenswrapper[4664]: I1003 07:49:32.926680 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:32Z","lastTransitionTime":"2025-10-03T07:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.029030 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.029069 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.029080 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.029096 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.029107 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:33Z","lastTransitionTime":"2025-10-03T07:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.131555 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.131598 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.131636 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.131656 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.131667 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:33Z","lastTransitionTime":"2025-10-03T07:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.233548 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.233718 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.233750 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.233769 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.233781 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:33Z","lastTransitionTime":"2025-10-03T07:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.284416 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-72cm2_6998d742-8d17-4f20-ab52-c30d9f7b0b89/kube-multus/0.log" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.284469 4664 generic.go:334] "Generic (PLEG): container finished" podID="6998d742-8d17-4f20-ab52-c30d9f7b0b89" containerID="a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef" exitCode=1 Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.284502 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-72cm2" event={"ID":"6998d742-8d17-4f20-ab52-c30d9f7b0b89","Type":"ContainerDied","Data":"a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef"} Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.284914 4664 scope.go:117] "RemoveContainer" containerID="a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.297826 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.308240 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.321005 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.333146 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:32Z\\\",\\\"message\\\":\\\"2025-10-03T07:48:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650\\\\n2025-10-03T07:48:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650 to /host/opt/cni/bin/\\\\n2025-10-03T07:48:47Z [verbose] multus-daemon started\\\\n2025-10-03T07:48:47Z [verbose] Readiness Indicator file check\\\\n2025-10-03T07:49:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.336429 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.336464 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.336475 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.336491 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.336501 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:33Z","lastTransitionTime":"2025-10-03T07:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.347981 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.359173 4664 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.370019 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.385403 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.405992 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:18Z\\\",\\\"message\\\":\\\"Pod openshift-ovn-kubernetes/ovnkube-node-2jpvm\\\\nI1003 07:49:18.633255 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1003 07:49:18.633257 6386 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq\\\\nI1003 07:49:18.633264 6386 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1003 07:49:18.633264 6386 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h865c\\\\nI1003 07:49:18.633271 6386 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1003 07:49:18.633218 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF1003 07:49:18.633264 6386 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:49:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.418756 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.432790 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.438739 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:33 
crc kubenswrapper[4664]: I1003 07:49:33.438775 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.438783 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.438799 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.438808 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:33Z","lastTransitionTime":"2025-10-03T07:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.447052 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\
\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.459711 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.472975 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.484318 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.496030 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.507488 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b74b4b6-2022-4308-8071-0f972dd1d922\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:33Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.541120 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.541352 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.541439 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.541510 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 
07:49:33.541566 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:33Z","lastTransitionTime":"2025-10-03T07:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.644255 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.644471 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.644572 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.644709 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.644809 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:33Z","lastTransitionTime":"2025-10-03T07:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.747960 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.747993 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.748001 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.748013 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.748021 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:33Z","lastTransitionTime":"2025-10-03T07:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.850030 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.850286 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.850468 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.851016 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.851104 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:33Z","lastTransitionTime":"2025-10-03T07:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.875742 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.875788 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.875770 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:33 crc kubenswrapper[4664]: E1003 07:49:33.875909 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:33 crc kubenswrapper[4664]: E1003 07:49:33.876001 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:33 crc kubenswrapper[4664]: E1003 07:49:33.876360 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.876663 4664 scope.go:117] "RemoveContainer" containerID="af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574" Oct 03 07:49:33 crc kubenswrapper[4664]: E1003 07:49:33.876883 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.954053 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.954105 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.954119 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.954137 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:33 crc kubenswrapper[4664]: I1003 07:49:33.954150 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:33Z","lastTransitionTime":"2025-10-03T07:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.057396 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.057436 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.057447 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.057464 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.057474 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:34Z","lastTransitionTime":"2025-10-03T07:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.159741 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.159950 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.160013 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.160102 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.160166 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:34Z","lastTransitionTime":"2025-10-03T07:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.263144 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.263440 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.263528 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.263643 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.263748 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:34Z","lastTransitionTime":"2025-10-03T07:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.289738 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-72cm2_6998d742-8d17-4f20-ab52-c30d9f7b0b89/kube-multus/0.log" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.289969 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-72cm2" event={"ID":"6998d742-8d17-4f20-ab52-c30d9f7b0b89","Type":"ContainerStarted","Data":"482e54714945acaea85fdeeb4b89eb9b16568c96319d07eb812ef88bd5faeb85"} Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.302649 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.315364 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.327382 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.338501 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.351153 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.363865 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b74b4b6-2022-4308-8071-0f972dd1d922\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.365890 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.365939 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.365952 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.365967 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 
07:49:34.365977 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:34Z","lastTransitionTime":"2025-10-03T07:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.376253 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.386344 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.398477 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.409132 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.419091 4664 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.430247 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482e54714945acaea85fdeeb4b89eb9b16568c96319d07eb812ef88bd5faeb85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:32Z\\\",\\\"message\\\":\\\"2025-10-03T07:48:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650\\\\n2025-10-03T07:48:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650 to /host/opt/cni/bin/\\\\n2025-10-03T07:48:47Z [verbose] multus-daemon started\\\\n2025-10-03T07:48:47Z [verbose] Readiness Indicator file check\\\\n2025-10-03T07:49:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.439235 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.452890 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.467793 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.467832 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.467845 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.467859 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.467869 4664 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:34Z","lastTransitionTime":"2025-10-03T07:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.471291 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\
":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:18Z\\\",\\\"message\\\":\\\"Pod openshift-ovn-kubernetes/ovnkube-node-2jpvm\\\\nI1003 07:49:18.633255 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1003 07:49:18.633257 6386 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq\\\\nI1003 07:49:18.633264 6386 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1003 07:49:18.633264 6386 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h865c\\\\nI1003 07:49:18.633271 6386 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1003 07:49:18.633218 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF1003 07:49:18.633264 6386 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:49:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.485239 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.495853 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:34Z is after 2025-08-24T17:21:41Z" Oct 03 
07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.569976 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.570026 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.570038 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.570052 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.570061 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:34Z","lastTransitionTime":"2025-10-03T07:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.672380 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.672445 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.672467 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.672487 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.672500 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:34Z","lastTransitionTime":"2025-10-03T07:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.775036 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.775079 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.775088 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.775127 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.775140 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:34Z","lastTransitionTime":"2025-10-03T07:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.875597 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:34 crc kubenswrapper[4664]: E1003 07:49:34.875737 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.877026 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.877068 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.877078 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.877114 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.877123 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:34Z","lastTransitionTime":"2025-10-03T07:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.979080 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.979113 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.979122 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.979137 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:34 crc kubenswrapper[4664]: I1003 07:49:34.979146 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:34Z","lastTransitionTime":"2025-10-03T07:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.081197 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.081236 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.081247 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.081262 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.081274 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:35Z","lastTransitionTime":"2025-10-03T07:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.183758 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.183804 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.183816 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.183834 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.183847 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:35Z","lastTransitionTime":"2025-10-03T07:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.286729 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.286768 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.286780 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.286796 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.286808 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:35Z","lastTransitionTime":"2025-10-03T07:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.389513 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.389567 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.389579 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.389596 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.389626 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:35Z","lastTransitionTime":"2025-10-03T07:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.492150 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.492192 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.492203 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.492220 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.492232 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:35Z","lastTransitionTime":"2025-10-03T07:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.594416 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.594452 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.594461 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.594474 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.594488 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:35Z","lastTransitionTime":"2025-10-03T07:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.696879 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.696919 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.696929 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.696943 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.696955 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:35Z","lastTransitionTime":"2025-10-03T07:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.799673 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.799717 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.799728 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.799742 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.799753 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:35Z","lastTransitionTime":"2025-10-03T07:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.876351 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.876399 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.876450 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:35 crc kubenswrapper[4664]: E1003 07:49:35.876490 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:35 crc kubenswrapper[4664]: E1003 07:49:35.876655 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:35 crc kubenswrapper[4664]: E1003 07:49:35.876805 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.902080 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.902129 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.902140 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.902156 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:35 crc kubenswrapper[4664]: I1003 07:49:35.902167 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:35Z","lastTransitionTime":"2025-10-03T07:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.004695 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.004733 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.004744 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.004759 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.004770 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:36Z","lastTransitionTime":"2025-10-03T07:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.107637 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.108203 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.108298 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.108384 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.108458 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:36Z","lastTransitionTime":"2025-10-03T07:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.212203 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.212239 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.212250 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.212265 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.212276 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:36Z","lastTransitionTime":"2025-10-03T07:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.315053 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.315095 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.315108 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.315124 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.315134 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:36Z","lastTransitionTime":"2025-10-03T07:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.417294 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.417331 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.417340 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.417353 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.417363 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:36Z","lastTransitionTime":"2025-10-03T07:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.519818 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.520116 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.520701 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.520826 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.520924 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:36Z","lastTransitionTime":"2025-10-03T07:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.623653 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.623699 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.623709 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.623727 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.623738 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:36Z","lastTransitionTime":"2025-10-03T07:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.683460 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.683759 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.683837 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.684035 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.684117 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:36Z","lastTransitionTime":"2025-10-03T07:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:36 crc kubenswrapper[4664]: E1003 07:49:36.696004 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:36Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.699765 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.699984 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.700260 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.700357 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.700440 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:36Z","lastTransitionTime":"2025-10-03T07:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:36 crc kubenswrapper[4664]: E1003 07:49:36.712979 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:36Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.730878 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.731133 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.731209 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.731281 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.731345 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:36Z","lastTransitionTime":"2025-10-03T07:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:36 crc kubenswrapper[4664]: E1003 07:49:36.750488 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:36Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.755523 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.755565 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.755577 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.755594 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.755620 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:36Z","lastTransitionTime":"2025-10-03T07:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:36 crc kubenswrapper[4664]: E1003 07:49:36.771113 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:36Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.774549 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.774587 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.774595 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.774635 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.774651 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:36Z","lastTransitionTime":"2025-10-03T07:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:36 crc kubenswrapper[4664]: E1003 07:49:36.785106 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:36Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:36 crc kubenswrapper[4664]: E1003 07:49:36.785220 4664 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.786858 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.786997 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.787107 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.787211 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.787300 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:36Z","lastTransitionTime":"2025-10-03T07:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.875755 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:36 crc kubenswrapper[4664]: E1003 07:49:36.876104 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.889504 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.889537 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.889547 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.889562 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.889572 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:36Z","lastTransitionTime":"2025-10-03T07:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.991237 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.991514 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.991586 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.991697 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:36 crc kubenswrapper[4664]: I1003 07:49:36.991761 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:36Z","lastTransitionTime":"2025-10-03T07:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.094199 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.094428 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.094492 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.094616 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.094692 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:37Z","lastTransitionTime":"2025-10-03T07:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.196728 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.197352 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.197440 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.197519 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.197622 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:37Z","lastTransitionTime":"2025-10-03T07:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.299587 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.300001 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.300166 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.300302 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.300434 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:37Z","lastTransitionTime":"2025-10-03T07:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.403182 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.403249 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.403273 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.403304 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.403325 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:37Z","lastTransitionTime":"2025-10-03T07:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.505267 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.505311 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.505320 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.505333 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.505344 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:37Z","lastTransitionTime":"2025-10-03T07:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.607786 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.607886 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.607901 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.607928 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.607942 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:37Z","lastTransitionTime":"2025-10-03T07:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.709699 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.709990 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.710068 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.710132 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.710195 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:37Z","lastTransitionTime":"2025-10-03T07:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.814028 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.814541 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.814761 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.814895 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.815000 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:37Z","lastTransitionTime":"2025-10-03T07:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.875373 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.875423 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.875373 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:37 crc kubenswrapper[4664]: E1003 07:49:37.875592 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:37 crc kubenswrapper[4664]: E1003 07:49:37.875719 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:37 crc kubenswrapper[4664]: E1003 07:49:37.875797 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.918874 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.918920 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.918933 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.918958 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:37 crc kubenswrapper[4664]: I1003 07:49:37.918977 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:37Z","lastTransitionTime":"2025-10-03T07:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.023122 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.023179 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.023196 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.023220 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.023269 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:38Z","lastTransitionTime":"2025-10-03T07:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.125894 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.125917 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.125925 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.125937 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.125945 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:38Z","lastTransitionTime":"2025-10-03T07:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.228183 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.228223 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.228233 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.228244 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.228253 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:38Z","lastTransitionTime":"2025-10-03T07:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.330619 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.330666 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.330677 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.330694 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.330707 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:38Z","lastTransitionTime":"2025-10-03T07:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.432943 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.432975 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.432983 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.432997 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.433021 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:38Z","lastTransitionTime":"2025-10-03T07:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.535639 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.535682 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.535695 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.535709 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.535719 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:38Z","lastTransitionTime":"2025-10-03T07:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.637891 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.637988 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.638000 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.638016 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.638050 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:38Z","lastTransitionTime":"2025-10-03T07:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.741074 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.741124 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.741134 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.741148 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.741157 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:38Z","lastTransitionTime":"2025-10-03T07:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.843695 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.843734 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.843771 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.843786 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.843798 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:38Z","lastTransitionTime":"2025-10-03T07:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.876023 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:38 crc kubenswrapper[4664]: E1003 07:49:38.876397 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.945665 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.945700 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.945712 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.945734 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:38 crc kubenswrapper[4664]: I1003 07:49:38.945748 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:38Z","lastTransitionTime":"2025-10-03T07:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.048560 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.048631 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.048643 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.048659 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.048671 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:39Z","lastTransitionTime":"2025-10-03T07:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.150423 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.150465 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.150474 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.150488 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.150498 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:39Z","lastTransitionTime":"2025-10-03T07:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.252482 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.252802 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.252811 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.252823 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.252833 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:39Z","lastTransitionTime":"2025-10-03T07:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.356926 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.356986 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.357000 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.357023 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.357041 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:39Z","lastTransitionTime":"2025-10-03T07:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.459766 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.459814 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.459827 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.459844 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.459856 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:39Z","lastTransitionTime":"2025-10-03T07:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.562075 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.562111 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.562122 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.562137 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.562148 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:39Z","lastTransitionTime":"2025-10-03T07:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.665025 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.665072 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.665087 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.665105 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.665118 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:39Z","lastTransitionTime":"2025-10-03T07:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.768354 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.768404 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.768416 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.768438 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.768452 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:39Z","lastTransitionTime":"2025-10-03T07:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.871168 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.871381 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.871479 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.871551 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.871647 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:39Z","lastTransitionTime":"2025-10-03T07:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.875621 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.875692 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:39 crc kubenswrapper[4664]: E1003 07:49:39.875749 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.875692 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:39 crc kubenswrapper[4664]: E1003 07:49:39.875883 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:39 crc kubenswrapper[4664]: E1003 07:49:39.875991 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.890523 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:39Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.900720 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:39Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.910319 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:39Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.922600 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482e54714945acaea85fdeeb4b89eb9b16568c96319d07eb812ef88bd5faeb85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:32Z\\\",\\\"message\\\":\\\"2025-10-03T07:48:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650\\\\n2025-10-03T07:48:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650 to /host/opt/cni/bin/\\\\n2025-10-03T07:48:47Z [verbose] multus-daemon started\\\\n2025-10-03T07:48:47Z [verbose] Readiness Indicator file check\\\\n2025-10-03T07:49:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:39Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.942069 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:39Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.958983 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:18Z\\\",\\\"message\\\":\\\"Pod openshift-ovn-kubernetes/ovnkube-node-2jpvm\\\\nI1003 07:49:18.633255 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1003 07:49:18.633257 6386 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq\\\\nI1003 07:49:18.633264 6386 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1003 07:49:18.633264 6386 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h865c\\\\nI1003 07:49:18.633271 6386 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1003 07:49:18.633218 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF1003 07:49:18.633264 6386 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:49:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:39Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.973404 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.973435 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.973444 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.973458 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.973467 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:39Z","lastTransitionTime":"2025-10-03T07:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.974154 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:39Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.985863 4664 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T07:49:39Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:39 crc kubenswrapper[4664]: I1003 07:49:39.996719 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:39Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.007937 4664 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:40Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.020817 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:40Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.032723 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:40Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.044445 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b74b4b6-2022-4308-8071-0f972dd1d922\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:40Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.058857 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:40Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.071206 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:40Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.075268 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.075306 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.075314 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.075328 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.075340 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:40Z","lastTransitionTime":"2025-10-03T07:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.083950 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:40Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.095644 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:40Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.177331 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.177363 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.177376 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.177400 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.177412 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:40Z","lastTransitionTime":"2025-10-03T07:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.279451 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.279504 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.279519 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.279536 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.279547 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:40Z","lastTransitionTime":"2025-10-03T07:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.382194 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.382290 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.382308 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.382330 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.382341 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:40Z","lastTransitionTime":"2025-10-03T07:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.484215 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.484255 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.484266 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.484284 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.484301 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:40Z","lastTransitionTime":"2025-10-03T07:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.586493 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.586531 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.586545 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.586562 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.586573 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:40Z","lastTransitionTime":"2025-10-03T07:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.689396 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.689443 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.689455 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.689471 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.689481 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:40Z","lastTransitionTime":"2025-10-03T07:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.791716 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.791770 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.791783 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.791798 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.791809 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:40Z","lastTransitionTime":"2025-10-03T07:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.875451 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:40 crc kubenswrapper[4664]: E1003 07:49:40.875811 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.891507 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.893955 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.893998 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.894010 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.894026 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.894038 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:40Z","lastTransitionTime":"2025-10-03T07:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.996498 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.996597 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.996638 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.996653 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:40 crc kubenswrapper[4664]: I1003 07:49:40.996674 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:40Z","lastTransitionTime":"2025-10-03T07:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.099013 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.099051 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.099061 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.099076 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.099085 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:41Z","lastTransitionTime":"2025-10-03T07:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.203253 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.203336 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.203348 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.203367 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.203383 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:41Z","lastTransitionTime":"2025-10-03T07:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.308683 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.309522 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.309532 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.309545 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.309557 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:41Z","lastTransitionTime":"2025-10-03T07:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.411464 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.411756 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.411835 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.411925 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.411996 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:41Z","lastTransitionTime":"2025-10-03T07:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.514867 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.514918 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.514937 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.514955 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.514966 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:41Z","lastTransitionTime":"2025-10-03T07:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.617803 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.617857 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.617870 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.617888 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.617900 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:41Z","lastTransitionTime":"2025-10-03T07:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.720077 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.720117 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.720131 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.720150 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.720162 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:41Z","lastTransitionTime":"2025-10-03T07:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.822630 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.822701 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.822717 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.822736 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.822751 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:41Z","lastTransitionTime":"2025-10-03T07:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.875904 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.875994 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.876030 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:41 crc kubenswrapper[4664]: E1003 07:49:41.876478 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:41 crc kubenswrapper[4664]: E1003 07:49:41.876290 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:41 crc kubenswrapper[4664]: E1003 07:49:41.876522 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.925452 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.925501 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.925513 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.925532 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:41 crc kubenswrapper[4664]: I1003 07:49:41.925544 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:41Z","lastTransitionTime":"2025-10-03T07:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.028295 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.028338 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.028349 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.028365 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.028377 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:42Z","lastTransitionTime":"2025-10-03T07:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.130953 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.131178 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.131289 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.131381 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.131462 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:42Z","lastTransitionTime":"2025-10-03T07:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.234667 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.234703 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.234711 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.234726 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.234735 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:42Z","lastTransitionTime":"2025-10-03T07:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.337198 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.337232 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.337244 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.337260 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.337270 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:42Z","lastTransitionTime":"2025-10-03T07:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.439792 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.439906 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.439922 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.439938 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.439950 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:42Z","lastTransitionTime":"2025-10-03T07:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.542148 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.542185 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.542195 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.542208 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.542218 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:42Z","lastTransitionTime":"2025-10-03T07:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.644855 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.644889 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.644901 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.644949 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.644961 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:42Z","lastTransitionTime":"2025-10-03T07:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.748065 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.748098 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.748109 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.748123 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.748133 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:42Z","lastTransitionTime":"2025-10-03T07:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.850582 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.850641 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.850652 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.850667 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.850679 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:42Z","lastTransitionTime":"2025-10-03T07:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.875364 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:42 crc kubenswrapper[4664]: E1003 07:49:42.875714 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.953462 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.953565 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.953580 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.953635 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:42 crc kubenswrapper[4664]: I1003 07:49:42.953653 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:42Z","lastTransitionTime":"2025-10-03T07:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.056433 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.056658 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.056810 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.056922 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.057000 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:43Z","lastTransitionTime":"2025-10-03T07:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.159452 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.159511 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.159521 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.159535 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.159545 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:43Z","lastTransitionTime":"2025-10-03T07:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.262209 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.262249 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.262259 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.262278 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.262288 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:43Z","lastTransitionTime":"2025-10-03T07:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.364730 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.364768 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.364779 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.364793 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.364805 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:43Z","lastTransitionTime":"2025-10-03T07:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.467846 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.467889 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.467899 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.467916 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.467928 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:43Z","lastTransitionTime":"2025-10-03T07:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.570997 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.571303 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.571382 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.571467 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.571550 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:43Z","lastTransitionTime":"2025-10-03T07:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.674651 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.674696 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.674705 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.674720 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.674729 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:43Z","lastTransitionTime":"2025-10-03T07:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.711639 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.711752 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.711824 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.711796196 +0000 UTC m=+148.532986706 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.711883 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.711890 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.711906 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.711920 4664 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.711924 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.711951 4664 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.711957 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.71194895 +0000 UTC m=+148.533139440 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.712027 4664 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.712058 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.712049063 +0000 UTC m=+148.533239553 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.712083 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.712100 4664 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.712109 4664 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.712183 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.712167726 +0000 UTC m=+148.533358216 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.712208 4664 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.712340 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.71230737 +0000 UTC m=+148.533497860 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.777935 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.777976 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.777986 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.778004 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.778016 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:43Z","lastTransitionTime":"2025-10-03T07:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.875923 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.875933 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.876076 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.876202 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.876366 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:43 crc kubenswrapper[4664]: E1003 07:49:43.876502 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.880600 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.880953 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.881019 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.881136 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.881203 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:43Z","lastTransitionTime":"2025-10-03T07:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.984284 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.984328 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.984341 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.984357 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:43 crc kubenswrapper[4664]: I1003 07:49:43.984368 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:43Z","lastTransitionTime":"2025-10-03T07:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.087599 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.087900 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.088002 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.088106 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.088198 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:44Z","lastTransitionTime":"2025-10-03T07:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.191217 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.191284 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.191296 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.191318 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.191331 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:44Z","lastTransitionTime":"2025-10-03T07:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.293769 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.293809 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.293817 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.293833 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.293847 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:44Z","lastTransitionTime":"2025-10-03T07:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.395991 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.396043 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.396055 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.396069 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.396078 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:44Z","lastTransitionTime":"2025-10-03T07:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.498213 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.498285 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.498299 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.498315 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.498326 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:44Z","lastTransitionTime":"2025-10-03T07:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.600498 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.600542 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.600553 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.600569 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.600581 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:44Z","lastTransitionTime":"2025-10-03T07:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.703123 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.703160 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.703168 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.703181 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.703191 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:44Z","lastTransitionTime":"2025-10-03T07:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.806541 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.806623 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.806637 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.806655 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.806665 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:44Z","lastTransitionTime":"2025-10-03T07:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.876205 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:44 crc kubenswrapper[4664]: E1003 07:49:44.876737 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.887141 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.909377 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.909421 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.909435 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.909452 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:44 crc kubenswrapper[4664]: I1003 07:49:44.909463 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:44Z","lastTransitionTime":"2025-10-03T07:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.012574 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.012661 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.012672 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.012686 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.012698 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:45Z","lastTransitionTime":"2025-10-03T07:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.114893 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.114939 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.114948 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.114962 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.114973 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:45Z","lastTransitionTime":"2025-10-03T07:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.217346 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.217399 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.217410 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.217427 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.217441 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:45Z","lastTransitionTime":"2025-10-03T07:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.318854 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.318880 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.318889 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.318901 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.318910 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:45Z","lastTransitionTime":"2025-10-03T07:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.421566 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.422154 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.422258 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.422333 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.422424 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:45Z","lastTransitionTime":"2025-10-03T07:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.524590 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.524664 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.524674 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.524691 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.524703 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:45Z","lastTransitionTime":"2025-10-03T07:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.627021 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.627077 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.627089 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.627109 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.627121 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:45Z","lastTransitionTime":"2025-10-03T07:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.729954 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.729992 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.730002 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.730016 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.730027 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:45Z","lastTransitionTime":"2025-10-03T07:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.832159 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.832216 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.832234 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.832253 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.832275 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:45Z","lastTransitionTime":"2025-10-03T07:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.875927 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.875999 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.875958 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:45 crc kubenswrapper[4664]: E1003 07:49:45.876093 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:45 crc kubenswrapper[4664]: E1003 07:49:45.876890 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:45 crc kubenswrapper[4664]: E1003 07:49:45.876918 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.878403 4664 scope.go:117] "RemoveContainer" containerID="af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.934455 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.934507 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.934518 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.934536 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:45 crc kubenswrapper[4664]: I1003 07:49:45.934547 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:45Z","lastTransitionTime":"2025-10-03T07:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.037572 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.037716 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.037737 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.037754 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.037766 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:46Z","lastTransitionTime":"2025-10-03T07:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.141084 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.141152 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.141164 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.141182 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.141212 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:46Z","lastTransitionTime":"2025-10-03T07:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.244121 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.244175 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.244210 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.244234 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.244247 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:46Z","lastTransitionTime":"2025-10-03T07:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.325798 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/2.log" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.328942 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerStarted","Data":"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"} Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.329369 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.342521 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.347369 4664 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.347418 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.347428 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.347446 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.347456 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:46Z","lastTransitionTime":"2025-10-03T07:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.360729 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-
03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.374032 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b74b4b6-2022-4308-8071-0f972dd1d922\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.388454 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.403064 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.416310 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.434789 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.450192 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.450260 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.450271 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.450308 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.450322 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:46Z","lastTransitionTime":"2025-10-03T07:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.450989 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.474025 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9408c318-d840-4eff-815e-152565efafbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ccdd0e3981ae1631d21b940d6ab096e6c5a8f62ea0d9edfba00c925b7b4a235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9054c0ddac91219be608702604b3f2fc398dcb23cd4ae2c87e29aea2267383a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://081e3f10dbfcd175f2595aabad8ff020b4878a8605ab4f85d68ecd8178bda548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://111f5ec5819c2f11bfc65c9cf5308b5d3ffc4b3
bb6f8b3f65f31a9323b56d9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b42bc27e52911b15b17e3effdc251425df3177be375aed26649ed02cea13e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc449fe962a0144d2f3088d5d5b4a8769035aca8b07118aca83de2cd3183651b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc449fe962a0144d2f3088d5d5b4a8769035aca8b07118aca83de2cd3183651b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01406adf04f1c29dcb0acaa2268c9514d4712a73f7f055149ae1507b1cdbb088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01406adf04f1c29dcb0acaa2268c9514d4712a73f7f055149ae1507b1cdbb088\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f6097fc39412ae589b05b16aede4059ae55e0bf6251ff504832fff0bba157aac\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6097fc39412ae589b05b16aede4059ae55e0bf6251ff504832fff0bba157aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.489383 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.503792 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.519889 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.534537 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4391bdd-694b-4c79-8482-12ecc43e15a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d71e13c58bbb5cba74df86672f02d7970dae7bc41a9c88aa6652f98c62fa7122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b60eef776685fde325952f0aa6c0b2a679d105f02227deae22666253ae8596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19b60eef776685fde325952f0aa6c0b2a679d105f02227deae22666253ae8596\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.549891 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482e54714945acaea85fdeeb4b89eb9b16568c96319d07eb812ef88bd5faeb85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:32Z\\\",\\\"message\\\":\\\"2025-10-03T07:48:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650\\\\n2025-10-03T07:48:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650 to /host/opt/cni/bin/\\\\n2025-10-03T07:48:47Z [verbose] multus-daemon started\\\\n2025-10-03T07:48:47Z [verbose] Readiness Indicator file check\\\\n2025-10-03T07:49:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.552748 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.552782 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.552792 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.552807 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.552816 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:46Z","lastTransitionTime":"2025-10-03T07:49:46Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.567320 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-
03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.587547 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:18Z\\\",\\\"message\\\":\\\"Pod openshift-ovn-kubernetes/ovnkube-node-2jpvm\\\\nI1003 07:49:18.633255 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1003 07:49:18.633257 6386 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq\\\\nI1003 07:49:18.633264 6386 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1003 07:49:18.633264 6386 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h865c\\\\nI1003 07:49:18.633271 6386 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1003 07:49:18.633218 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF1003 07:49:18.633264 6386 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:49:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"
containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.604713 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.617095 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 
07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.628036 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:46Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.655466 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.655553 4664 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.655566 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.655581 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.655594 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:46Z","lastTransitionTime":"2025-10-03T07:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.758369 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.758492 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.758504 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.758520 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.758529 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:46Z","lastTransitionTime":"2025-10-03T07:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.862418 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.862466 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.862479 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.862495 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.862510 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:46Z","lastTransitionTime":"2025-10-03T07:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.875982 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:46 crc kubenswrapper[4664]: E1003 07:49:46.876181 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.964981 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.965013 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.965025 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.965037 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:46 crc kubenswrapper[4664]: I1003 07:49:46.965048 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:46Z","lastTransitionTime":"2025-10-03T07:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.068090 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.068152 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.068169 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.068192 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.068208 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:47Z","lastTransitionTime":"2025-10-03T07:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.175323 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.175372 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.175397 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.175411 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.175422 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:47Z","lastTransitionTime":"2025-10-03T07:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.176653 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.176704 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.176732 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.176757 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.176774 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:47Z","lastTransitionTime":"2025-10-03T07:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:47 crc kubenswrapper[4664]: E1003 07:49:47.189374 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.193166 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.193208 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.193217 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.193230 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.193240 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:47Z","lastTransitionTime":"2025-10-03T07:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:47 crc kubenswrapper[4664]: E1003 07:49:47.204210 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.207938 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.207965 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.207973 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.207986 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.207995 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:47Z","lastTransitionTime":"2025-10-03T07:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:47 crc kubenswrapper[4664]: E1003 07:49:47.219125 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.222944 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.222979 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.222988 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.223002 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.223014 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:47Z","lastTransitionTime":"2025-10-03T07:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:47 crc kubenswrapper[4664]: E1003 07:49:47.235448 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.239593 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.239654 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.239667 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.239683 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.239719 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:47Z","lastTransitionTime":"2025-10-03T07:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:47 crc kubenswrapper[4664]: E1003 07:49:47.251774 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: E1003 07:49:47.251929 4664 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.278355 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.278395 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.278440 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.278456 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.278465 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:47Z","lastTransitionTime":"2025-10-03T07:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.333995 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/3.log" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.334722 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/2.log" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.336991 4664 generic.go:334] "Generic (PLEG): container finished" podID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerID="46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690" exitCode=1 Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.337026 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerDied","Data":"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"} Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.337265 4664 scope.go:117] "RemoveContainer" containerID="af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.337857 4664 scope.go:117] "RemoveContainer" containerID="46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690" Oct 03 07:49:47 crc kubenswrapper[4664]: E1003 07:49:47.338014 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.356341 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.369641 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 
07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.379272 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.380382 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.380409 4664 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.380420 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.380435 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.380446 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:47Z","lastTransitionTime":"2025-10-03T07:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.390970 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints 
registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.407872 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9dfaab4099a0e60f66ad4c28ea685e11df2e1c3af1ac6805a4a1d70d841574\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:18Z\\\",\\\"message\\\":\\\"Pod openshift-ovn-kubernetes/ovnkube-node-2jpvm\\\\nI1003 07:49:18.633255 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1003 07:49:18.633257 6386 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq\\\\nI1003 07:49:18.633264 6386 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1003 07:49:18.633264 6386 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h865c\\\\nI1003 07:49:18.633271 6386 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1003 07:49:18.633218 6386 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF1003 07:49:18.633264 6386 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:49:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:46Z\\\",\\\"message\\\":\\\"ternalversions/factory.go:141\\\\nI1003 07:49:46.647758 6739 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:49:46.647937 6739 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 07:49:46.648037 6739 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 07:49:46.648258 6739 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 07:49:46.648314 6739 factory.go:656] Stopping watch factory\\\\nI1003 07:49:46.648371 6739 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 07:49:46.677200 6739 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1003 07:49:46.677255 6739 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1003 07:49:46.677350 6739 ovnkube.go:599] Stopped ovnkube\\\\nI1003 07:49:46.677384 6739 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 07:49:46.677538 6739 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:49:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0
b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.418466 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.428976 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.441528 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.455119 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.467936 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.478960 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.483051 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.483094 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.483107 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.483125 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.483138 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:47Z","lastTransitionTime":"2025-10-03T07:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.492276 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.503711 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b74b4b6-2022-4308-8071-0f972dd1d922\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.515236 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.527737 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.546796 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9408c318-d840-4eff-815e-152565efafbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ccdd0e3981ae1631d21b940d6ab096e6c5a8f62ea0d9edfba00c925b7b4a235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9054c0ddac91219be608702604b3f2fc398dcb23cd4ae2c87e29aea2267383a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://081e3f10dbfcd175f2595aabad8ff020b4878a8605ab4f85d68ecd8178bda548\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://111f5ec5819c2f11bfc65c9cf5308b5d3ffc4b3bb6f8b3f65f31a9323b56d9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b42bc27e52911b15b17e3effdc251425df3177be375aed26649ed02cea13e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc449fe962a0144d2f3088d5d5b4a8769035aca8b07118aca83de2cd3183651b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc449fe962a0144d2f3088d5d5b4a8769035aca8b07118aca83de2cd3183651b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01406adf04f1c29dcb0acaa2268c9514d4712a73f7f055149ae1507b1cdbb088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01406adf04f1c29dcb0acaa2268c9514d4712a73f7f055149ae1507b1cdbb088\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f6097fc39412ae589b05b16aede4059ae55e0bf6251ff504832fff0bba157aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6097fc39412ae589b05b16aede4059ae55e0bf6251ff504832fff0bba157aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.559976 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.571076 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4391bdd-694b-4c79-8482-12ecc43e15a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d71e13c58bbb5cba74df86672f02d7970dae7bc41a9c88aa6652f98c62fa7122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b60eef776685fde325952f0aa6c0b2a679d105f02227deae22666253ae8596\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19b60eef776685fde325952f0aa6c0b2a679d105f02227deae22666253ae8596\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.585047 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.585095 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.585110 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.585127 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.585140 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:47Z","lastTransitionTime":"2025-10-03T07:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.587011 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482e54714945acaea85fdeeb4b89eb9b16568c96319d07eb812ef88bd5faeb85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:32Z\\\",\\\"message\\\":\\\"2025-10-03T07:48:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650\\\\n2025-10-03T07:48:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650 to /host/opt/cni/bin/\\\\n2025-10-03T07:48:47Z [verbose] multus-daemon started\\\\n2025-10-03T07:48:47Z [verbose] Readiness Indicator file check\\\\n2025-10-03T07:49:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:47Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.688595 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.688654 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.688664 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.688677 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.688686 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:47Z","lastTransitionTime":"2025-10-03T07:49:47Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.791443 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.791762 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.791772 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.791784 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.791794 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:47Z","lastTransitionTime":"2025-10-03T07:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.875361 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.875429 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:47 crc kubenswrapper[4664]: E1003 07:49:47.875507 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.875866 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:47 crc kubenswrapper[4664]: E1003 07:49:47.875939 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:47 crc kubenswrapper[4664]: E1003 07:49:47.876141 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.894657 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.894695 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.894706 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.894721 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.894730 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:47Z","lastTransitionTime":"2025-10-03T07:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.997353 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.997409 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.997421 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.997439 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:47 crc kubenswrapper[4664]: I1003 07:49:47.997454 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:47Z","lastTransitionTime":"2025-10-03T07:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.100337 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.100382 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.100390 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.100405 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.100415 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:48Z","lastTransitionTime":"2025-10-03T07:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.202541 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.202571 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.202580 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.202596 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.202627 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:48Z","lastTransitionTime":"2025-10-03T07:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.305274 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.305333 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.305346 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.305364 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.305377 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:48Z","lastTransitionTime":"2025-10-03T07:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.342691 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/3.log" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.345311 4664 scope.go:117] "RemoveContainer" containerID="46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690" Oct 03 07:49:48 crc kubenswrapper[4664]: E1003 07:49:48.345441 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.356879 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.369024 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.382303 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.394460 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.404386 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.409053 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.409085 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.409092 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.409106 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.409117 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:48Z","lastTransitionTime":"2025-10-03T07:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.416101 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.428266 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b74b4b6-2022-4308-8071-0f972dd1d922\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.441321 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.452230 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.470728 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9408c318-d840-4eff-815e-152565efafbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ccdd0e3981ae1631d21b940d6ab096e6c5a8f62ea0d9edfba00c925b7b4a235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9054c0ddac91219be608702604b3f2fc398dcb23cd4ae2c87e29aea2267383a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://081e3f10dbfcd175f2595aabad8ff020b4878a8605ab4f85d68ecd8178bda548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://111f5ec5819c2f11bfc65c9cf5308b5d3ffc4b3bb6f8b3f65f31a9323b56d9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b42bc27e52911b15b17e3effdc251425df3177be375aed26649ed02cea13e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc449fe962a0144d2f3088d5d5b4a8769035aca8b07118aca83de2cd3183651b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc449fe962a0144d2f3088d5d5b4a8769035aca8b07118aca83de2cd3183651b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01406adf04f1c29dcb0acaa2268c9514d4712a73f7f055149ae1507b1cdbb088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01406adf04f1c29dcb0acaa2268c9514d4712a73f7f055149ae1507b1cdbb088\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f6097fc39412ae589b05b16aede4059ae55e0bf6251ff504832fff0bba157aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6097fc39412ae589b05b16aede4059ae55e0bf6251ff504832fff0bba157aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.481520 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.491539 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.499637 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4391bdd-694b-4c79-8482-12ecc43e15a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d71e13c58bbb5cba74df86672f02d7970dae7bc41a9c88aa6652f98c62fa7122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b60eef776685fde325952f0aa6c0b2a679d105f02227deae22666253ae8596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19b60eef776685fde325952f0aa6c0b2a679d105f02227deae22666253ae8596\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.510786 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482e54714945acaea85fdeeb4b89eb9b16568c96319d07eb812ef88bd5faeb85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:32Z\\\",\\\"message\\\":\\\"2025-10-03T07:48:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650\\\\n2025-10-03T07:48:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650 to /host/opt/cni/bin/\\\\n2025-10-03T07:48:47Z [verbose] multus-daemon started\\\\n2025-10-03T07:48:47Z [verbose] Readiness Indicator file check\\\\n2025-10-03T07:49:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.511225 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.511268 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.511280 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.511294 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.511311 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:48Z","lastTransitionTime":"2025-10-03T07:49:48Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.523403 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-
10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.535674 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.548639 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.569802 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:46Z\\\",\\\"message\\\":\\\"ternalversions/factory.go:141\\\\nI1003 07:49:46.647758 6739 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:49:46.647937 6739 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 07:49:46.648037 6739 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 07:49:46.648258 6739 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 07:49:46.648314 6739 factory.go:656] Stopping watch factory\\\\nI1003 07:49:46.648371 6739 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 07:49:46.677200 6739 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1003 07:49:46.677255 6739 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1003 07:49:46.677350 6739 ovnkube.go:599] Stopped ovnkube\\\\nI1003 07:49:46.677384 6739 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 07:49:46.677538 6739 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:49:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.588396 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:48Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.613798 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.613851 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.613861 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.613875 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.613885 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:48Z","lastTransitionTime":"2025-10-03T07:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.716634 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.716710 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.716724 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.716752 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.716765 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:48Z","lastTransitionTime":"2025-10-03T07:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.819511 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.819556 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.819568 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.819585 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.819598 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:48Z","lastTransitionTime":"2025-10-03T07:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.875783 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:48 crc kubenswrapper[4664]: E1003 07:49:48.875951 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.923122 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.923193 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.923206 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.923221 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:48 crc kubenswrapper[4664]: I1003 07:49:48.923257 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:48Z","lastTransitionTime":"2025-10-03T07:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.025533 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.025577 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.025587 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.025620 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.025632 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:49Z","lastTransitionTime":"2025-10-03T07:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.127950 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.128013 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.128026 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.128047 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.128063 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:49Z","lastTransitionTime":"2025-10-03T07:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.231807 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.231858 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.231868 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.231883 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.231894 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:49Z","lastTransitionTime":"2025-10-03T07:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.334571 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.334635 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.334649 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.334664 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.334675 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:49Z","lastTransitionTime":"2025-10-03T07:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.437186 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.437288 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.437296 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.437308 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.437318 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:49Z","lastTransitionTime":"2025-10-03T07:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.540009 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.540048 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.540058 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.540073 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.540083 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:49Z","lastTransitionTime":"2025-10-03T07:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.642053 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.642339 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.642401 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.642480 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.642538 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:49Z","lastTransitionTime":"2025-10-03T07:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.745150 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.745186 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.745194 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.745208 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.745218 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:49Z","lastTransitionTime":"2025-10-03T07:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.847804 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.847865 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.847876 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.847893 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.847905 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:49Z","lastTransitionTime":"2025-10-03T07:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.875147 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:49 crc kubenswrapper[4664]: E1003 07:49:49.875488 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.875277 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:49 crc kubenswrapper[4664]: E1003 07:49:49.875721 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.875148 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:49 crc kubenswrapper[4664]: E1003 07:49:49.875893 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.886774 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4391bdd-694b-4c79-8482-12ecc43e15a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d71e13c58bbb5cba74df86672f02d7970dae7bc41a9c88aa6652f98c62fa7122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b60eef776685fde325952f0aa6c0b2a679d105f02227deae22666253ae8596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19b60eef776685fde325952f0aa6c0b2a679d105f02227deae22666253ae8596\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.898105 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482e54714945acaea85fdeeb4b89eb9b16568c96319d07eb812ef88bd5faeb85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:32Z\\\",\\\"message\\\":\\\"2025-10-03T07:48:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650\\\\n2025-10-03T07:48:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650 to /host/opt/cni/bin/\\\\n2025-10-03T07:48:47Z [verbose] multus-daemon started\\\\n2025-10-03T07:48:47Z [verbose] Readiness Indicator file check\\\\n2025-10-03T07:49:32Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.909912 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.926161 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:46Z\\\",\\\"message\\\":\\\"ternalversions/factory.go:141\\\\nI1003 07:49:46.647758 6739 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:49:46.647937 6739 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 07:49:46.648037 6739 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 07:49:46.648258 6739 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 07:49:46.648314 6739 factory.go:656] Stopping watch factory\\\\nI1003 07:49:46.648371 6739 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 07:49:46.677200 6739 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1003 07:49:46.677255 6739 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1003 07:49:46.677350 6739 ovnkube.go:599] Stopped ovnkube\\\\nI1003 07:49:46.677384 6739 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 07:49:46.677538 6739 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:49:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.942533 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.950971 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.951004 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.951014 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.951030 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.951060 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:49Z","lastTransitionTime":"2025-10-03T07:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.955422 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.967252 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:49 crc 
kubenswrapper[4664]: I1003 07:49:49.984425 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:49 crc kubenswrapper[4664]: I1003 07:49:49.999300 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:49Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.014536 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.026119 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crco
nt/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.038190 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b74b4b6-2022-4308-8071-0f972dd1d922\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.052659 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.056680 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.056730 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.056745 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.056769 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.056786 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:50Z","lastTransitionTime":"2025-10-03T07:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.065890 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.082275 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.100688 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9408c318-d840-4eff-815e-152565efafbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ccdd0e3981ae1631d21b940d6ab096e6c5a8f62ea0d9edfba00c925b7b4a235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9054c0ddac91219be608702604b3f2fc398dcb23cd4ae2c87e29aea2267383a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://081e3f10dbfcd175f2595aabad8ff020b4878a8605ab4f85d68ecd8178bda548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://111f5ec5819c2f11bfc65c9cf5308b5d3ffc4b3
bb6f8b3f65f31a9323b56d9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b42bc27e52911b15b17e3effdc251425df3177be375aed26649ed02cea13e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc449fe962a0144d2f3088d5d5b4a8769035aca8b07118aca83de2cd3183651b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc449fe962a0144d2f3088d5d5b4a8769035aca8b07118aca83de2cd3183651b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01406adf04f1c29dcb0acaa2268c9514d4712a73f7f055149ae1507b1cdbb088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01406adf04f1c29dcb0acaa2268c9514d4712a73f7f055149ae1507b1cdbb088\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f6097fc39412ae589b05b16aede4059ae55e0bf6251ff504832fff0bba157aac\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6097fc39412ae589b05b16aede4059ae55e0bf6251ff504832fff0bba157aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.111893 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.123006 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.135431 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:50Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.160386 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.160438 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.160451 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.160478 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.160492 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:50Z","lastTransitionTime":"2025-10-03T07:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.263256 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.263333 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.263345 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.263360 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.263371 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:50Z","lastTransitionTime":"2025-10-03T07:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.365505 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.365546 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.365555 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.365569 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.365581 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:50Z","lastTransitionTime":"2025-10-03T07:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.467572 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.467621 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.467634 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.467648 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.467659 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:50Z","lastTransitionTime":"2025-10-03T07:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.569723 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.569764 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.569774 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.569789 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.569800 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:50Z","lastTransitionTime":"2025-10-03T07:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.671437 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.671475 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.671485 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.671497 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.671507 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:50Z","lastTransitionTime":"2025-10-03T07:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.773676 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.773715 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.773728 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.773746 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.773758 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:50Z","lastTransitionTime":"2025-10-03T07:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.875338 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:50 crc kubenswrapper[4664]: E1003 07:49:50.875471 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.875708 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.875730 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.875741 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.875754 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.875764 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:50Z","lastTransitionTime":"2025-10-03T07:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.978113 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.978172 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.978183 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.978200 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:50 crc kubenswrapper[4664]: I1003 07:49:50.978211 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:50Z","lastTransitionTime":"2025-10-03T07:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.080410 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.080459 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.080472 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.080490 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.080502 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:51Z","lastTransitionTime":"2025-10-03T07:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.182758 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.182796 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.182807 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.182824 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.182835 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:51Z","lastTransitionTime":"2025-10-03T07:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.285970 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.286013 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.286022 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.286038 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.286049 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:51Z","lastTransitionTime":"2025-10-03T07:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.389053 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.389107 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.389119 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.389141 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.389155 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:51Z","lastTransitionTime":"2025-10-03T07:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.492579 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.492641 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.492653 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.492667 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.492676 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:51Z","lastTransitionTime":"2025-10-03T07:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.594707 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.594754 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.594769 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.594785 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.594796 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:51Z","lastTransitionTime":"2025-10-03T07:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.696793 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.696848 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.696864 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.696882 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.696894 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:51Z","lastTransitionTime":"2025-10-03T07:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.799387 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.799416 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.799427 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.799440 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.799448 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:51Z","lastTransitionTime":"2025-10-03T07:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.875226 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.875305 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:51 crc kubenswrapper[4664]: E1003 07:49:51.875376 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:51 crc kubenswrapper[4664]: E1003 07:49:51.875448 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.875525 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:51 crc kubenswrapper[4664]: E1003 07:49:51.875700 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.902017 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.902050 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.902058 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.902070 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:51 crc kubenswrapper[4664]: I1003 07:49:51.902078 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:51Z","lastTransitionTime":"2025-10-03T07:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.004110 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.004147 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.004159 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.004175 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.004187 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:52Z","lastTransitionTime":"2025-10-03T07:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.106071 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.106101 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.106109 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.106122 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.106131 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:52Z","lastTransitionTime":"2025-10-03T07:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.208822 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.208853 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.208860 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.208872 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.208881 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:52Z","lastTransitionTime":"2025-10-03T07:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.311717 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.311763 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.311776 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.311791 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.311804 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:52Z","lastTransitionTime":"2025-10-03T07:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.415098 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.415133 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.415144 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.415158 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.415168 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:52Z","lastTransitionTime":"2025-10-03T07:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.517029 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.517062 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.517072 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.517084 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.517093 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:52Z","lastTransitionTime":"2025-10-03T07:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.619431 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.619500 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.619515 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.619531 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.619541 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:52Z","lastTransitionTime":"2025-10-03T07:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.721914 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.721949 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.721959 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.721974 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.721984 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:52Z","lastTransitionTime":"2025-10-03T07:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.824420 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.824469 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.824483 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.824499 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.824510 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:52Z","lastTransitionTime":"2025-10-03T07:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.875987 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:52 crc kubenswrapper[4664]: E1003 07:49:52.876324 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.927424 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.927488 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.927500 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.927570 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:52 crc kubenswrapper[4664]: I1003 07:49:52.927595 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:52Z","lastTransitionTime":"2025-10-03T07:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.029981 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.030294 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.030383 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.030479 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.030628 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:53Z","lastTransitionTime":"2025-10-03T07:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.133078 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.133417 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.133510 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.133591 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.133687 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:53Z","lastTransitionTime":"2025-10-03T07:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.235812 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.235852 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.235864 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.235881 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.235893 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:53Z","lastTransitionTime":"2025-10-03T07:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.338729 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.338772 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.338783 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.338798 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.338809 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:53Z","lastTransitionTime":"2025-10-03T07:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.442052 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.442296 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.442378 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.442500 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.442588 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:53Z","lastTransitionTime":"2025-10-03T07:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.545029 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.545065 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.545074 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.545087 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.545100 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:53Z","lastTransitionTime":"2025-10-03T07:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.647087 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.647127 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.647137 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.647151 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.647162 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:53Z","lastTransitionTime":"2025-10-03T07:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.749723 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.749764 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.749775 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.749792 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.749805 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:53Z","lastTransitionTime":"2025-10-03T07:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.852303 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.852334 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.852344 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.852358 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.852369 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:53Z","lastTransitionTime":"2025-10-03T07:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.877527 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:53 crc kubenswrapper[4664]: E1003 07:49:53.877690 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.877788 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.877884 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:53 crc kubenswrapper[4664]: E1003 07:49:53.878020 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:53 crc kubenswrapper[4664]: E1003 07:49:53.878082 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.954350 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.954386 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.954395 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.954408 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:53 crc kubenswrapper[4664]: I1003 07:49:53.954418 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:53Z","lastTransitionTime":"2025-10-03T07:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.057068 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.057106 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.057117 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.057130 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.057139 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:54Z","lastTransitionTime":"2025-10-03T07:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.158934 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.159187 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.159266 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.159370 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.159468 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:54Z","lastTransitionTime":"2025-10-03T07:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.262407 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.262655 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.262748 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.262867 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.262961 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:54Z","lastTransitionTime":"2025-10-03T07:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.365015 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.365055 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.365064 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.365078 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.365087 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:54Z","lastTransitionTime":"2025-10-03T07:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.467295 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.467334 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.467344 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.467358 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.467369 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:54Z","lastTransitionTime":"2025-10-03T07:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.569836 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.569943 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.569962 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.569980 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.569990 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:54Z","lastTransitionTime":"2025-10-03T07:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.672255 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.672295 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.672307 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.672322 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.672333 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:54Z","lastTransitionTime":"2025-10-03T07:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.774972 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.775214 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.775314 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.775505 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.775584 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:54Z","lastTransitionTime":"2025-10-03T07:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.875553 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:54 crc kubenswrapper[4664]: E1003 07:49:54.875735 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.878230 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.878349 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.878411 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.878470 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.878549 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:54Z","lastTransitionTime":"2025-10-03T07:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.981297 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.981342 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.981354 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.981372 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:54 crc kubenswrapper[4664]: I1003 07:49:54.981383 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:54Z","lastTransitionTime":"2025-10-03T07:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.084156 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.084200 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.084209 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.084224 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.084235 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:55Z","lastTransitionTime":"2025-10-03T07:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.186405 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.186729 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.187102 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.187180 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.187278 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:55Z","lastTransitionTime":"2025-10-03T07:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.289674 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.289960 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.290046 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.290112 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.290179 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:55Z","lastTransitionTime":"2025-10-03T07:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.391885 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.391916 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.391924 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.391936 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.391945 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:55Z","lastTransitionTime":"2025-10-03T07:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.494111 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.494144 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.494155 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.494169 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.494179 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:55Z","lastTransitionTime":"2025-10-03T07:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.596905 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.596945 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.596957 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.596972 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.596983 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:55Z","lastTransitionTime":"2025-10-03T07:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.699280 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.699332 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.699345 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.699362 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.699375 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:55Z","lastTransitionTime":"2025-10-03T07:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.801403 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.801448 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.801457 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.801470 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.801481 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:55Z","lastTransitionTime":"2025-10-03T07:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.875465 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:55 crc kubenswrapper[4664]: E1003 07:49:55.875621 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.875630 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.875467 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:55 crc kubenswrapper[4664]: E1003 07:49:55.875701 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:55 crc kubenswrapper[4664]: E1003 07:49:55.875876 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.904854 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.904890 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.904901 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.904917 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:55 crc kubenswrapper[4664]: I1003 07:49:55.904928 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:55Z","lastTransitionTime":"2025-10-03T07:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.007542 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.007579 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.007588 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.007601 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.007634 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:56Z","lastTransitionTime":"2025-10-03T07:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.111753 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.112178 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.112191 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.112210 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.112221 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:56Z","lastTransitionTime":"2025-10-03T07:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.215011 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.215062 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.215071 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.215086 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.215097 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:56Z","lastTransitionTime":"2025-10-03T07:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.318370 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.318419 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.318432 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.318449 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.318462 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:56Z","lastTransitionTime":"2025-10-03T07:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.420637 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.420679 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.420689 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.420705 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.420718 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:56Z","lastTransitionTime":"2025-10-03T07:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.523119 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.523163 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.523174 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.523189 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.523200 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:56Z","lastTransitionTime":"2025-10-03T07:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.625453 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.625489 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.625498 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.625511 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.625520 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:56Z","lastTransitionTime":"2025-10-03T07:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.728438 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.728504 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.728516 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.728534 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.728547 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:56Z","lastTransitionTime":"2025-10-03T07:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.830572 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.830646 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.830663 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.830682 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.830695 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:56Z","lastTransitionTime":"2025-10-03T07:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.876106 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:56 crc kubenswrapper[4664]: E1003 07:49:56.876322 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.933410 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.933461 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.933473 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.933494 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:56 crc kubenswrapper[4664]: I1003 07:49:56.933508 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:56Z","lastTransitionTime":"2025-10-03T07:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.035741 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.035777 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.035786 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.035800 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.035811 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:57Z","lastTransitionTime":"2025-10-03T07:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.138453 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.138498 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.138510 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.138527 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.138537 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:57Z","lastTransitionTime":"2025-10-03T07:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.241230 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.241260 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.241268 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.241280 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.241288 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:57Z","lastTransitionTime":"2025-10-03T07:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.343502 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.343528 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.343536 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.343549 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.343560 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:57Z","lastTransitionTime":"2025-10-03T07:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.382018 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.382064 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.382075 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.382090 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.382101 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:57Z","lastTransitionTime":"2025-10-03T07:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:57 crc kubenswrapper[4664]: E1003 07:49:57.393748 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.396913 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.396951 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.396963 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.396975 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.396985 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:57Z","lastTransitionTime":"2025-10-03T07:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:57 crc kubenswrapper[4664]: E1003 07:49:57.411059 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.414356 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.414394 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.414407 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.414424 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.414435 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:57Z","lastTransitionTime":"2025-10-03T07:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:57 crc kubenswrapper[4664]: E1003 07:49:57.424592 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.428308 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.428337 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.428345 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.428358 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.428366 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:57Z","lastTransitionTime":"2025-10-03T07:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:57 crc kubenswrapper[4664]: E1003 07:49:57.439985 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.444521 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.444583 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.444602 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.444637 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.444650 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:57Z","lastTransitionTime":"2025-10-03T07:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:57 crc kubenswrapper[4664]: E1003 07:49:57.456040 4664 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eca81c87-676e-4667-a87b-e015ec0be81c\\\",\\\"systemUUID\\\":\\\"7be6e848-96ef-48b1-8627-9ddc13d5cc87\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:57Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:57 crc kubenswrapper[4664]: E1003 07:49:57.456149 4664 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.457864 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.457946 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.457959 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.457974 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.457986 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:57Z","lastTransitionTime":"2025-10-03T07:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.560546 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.560590 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.560629 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.560646 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.560659 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:57Z","lastTransitionTime":"2025-10-03T07:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.662982 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.663039 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.663052 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.663067 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.663078 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:57Z","lastTransitionTime":"2025-10-03T07:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.765896 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.765939 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.765948 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.765961 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.765971 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:57Z","lastTransitionTime":"2025-10-03T07:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.868828 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.868877 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.868889 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.868905 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.868917 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:57Z","lastTransitionTime":"2025-10-03T07:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.876205 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.876310 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.876330 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:57 crc kubenswrapper[4664]: E1003 07:49:57.876373 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:57 crc kubenswrapper[4664]: E1003 07:49:57.876486 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:57 crc kubenswrapper[4664]: E1003 07:49:57.876577 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.971154 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.971196 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.971211 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.971229 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:57 crc kubenswrapper[4664]: I1003 07:49:57.971241 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:57Z","lastTransitionTime":"2025-10-03T07:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.073443 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.073478 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.073486 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.073499 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.073508 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:58Z","lastTransitionTime":"2025-10-03T07:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.176019 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.176123 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.176193 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.176223 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.176241 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:58Z","lastTransitionTime":"2025-10-03T07:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.279288 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.279324 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.279334 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.279347 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.279356 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:58Z","lastTransitionTime":"2025-10-03T07:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.381389 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.381427 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.381438 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.381454 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.381466 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:58Z","lastTransitionTime":"2025-10-03T07:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.483232 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.483289 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.483302 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.483319 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.483331 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:58Z","lastTransitionTime":"2025-10-03T07:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.585824 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.585858 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.585868 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.585880 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.585890 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:58Z","lastTransitionTime":"2025-10-03T07:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.688500 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.688542 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.688553 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.688569 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.688580 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:58Z","lastTransitionTime":"2025-10-03T07:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.790990 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.791040 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.791049 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.791074 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.791094 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:58Z","lastTransitionTime":"2025-10-03T07:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.875152 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:49:58 crc kubenswrapper[4664]: E1003 07:49:58.875317 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.892891 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.892932 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.892944 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.892959 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.892970 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:58Z","lastTransitionTime":"2025-10-03T07:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.995784 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.995826 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.995838 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.995853 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:58 crc kubenswrapper[4664]: I1003 07:49:58.995864 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:58Z","lastTransitionTime":"2025-10-03T07:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.098909 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.098982 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.098993 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.099009 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.099018 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:59Z","lastTransitionTime":"2025-10-03T07:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.200802 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.200855 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.200872 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.200888 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.200898 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:59Z","lastTransitionTime":"2025-10-03T07:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.303407 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.303444 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.303451 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.303464 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.303472 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:59Z","lastTransitionTime":"2025-10-03T07:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.406110 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.406165 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.406179 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.406196 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.406205 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:59Z","lastTransitionTime":"2025-10-03T07:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.509086 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.509143 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.509152 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.509179 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.509188 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:59Z","lastTransitionTime":"2025-10-03T07:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.611349 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.611390 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.611401 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.611416 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.611428 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:59Z","lastTransitionTime":"2025-10-03T07:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.714399 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.714441 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.714452 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.714468 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.714483 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:59Z","lastTransitionTime":"2025-10-03T07:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.816584 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.816642 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.816652 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.816664 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.816672 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:59Z","lastTransitionTime":"2025-10-03T07:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.875366 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.875445 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:49:59 crc kubenswrapper[4664]: E1003 07:49:59.875519 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.875659 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:49:59 crc kubenswrapper[4664]: E1003 07:49:59.875718 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:49:59 crc kubenswrapper[4664]: E1003 07:49:59.875804 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.890512 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e03e57b4-cb64-42fa-b8c5-ee4863291568\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31867b4c35050cbe6c4247edfe085dc9429d5516c19ad8da432699262b7fd092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.907298 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8078a587-9c95-43a8-9cf8-4286835f8134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938aaa7e3e507e4f04632ecbdeb12d1e169b59fdcd8c19838e21fedc4b55d721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682632866f1e606691047ac952506c02f250c825f142a1676a4abcca32e58466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7eed5fa26eeb30a803512265d01d5fbb9fffad00d857a683939158da6b0bb7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e705e4bd02aaf141572ec91291240f8f7ec92b33cd45ba98ea8ce83f06918\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.918873 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.918907 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.918919 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.918934 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.918944 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:49:59Z","lastTransitionTime":"2025-10-03T07:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.923487 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b74b4b6-2022-4308-8071-0f972dd1d922\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b1c43aa954d4f0ba95fd6bafe459a5f2d2df82b8cdbcd661c2a6e6238526fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd376fcbd2d46400852a695c1682ba853522bcc98038fcdb55c89d87d13ef012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5a207be586adc5a1d852c18ed4ca1bf7d81b707a2f55ac2e76f31c5a94ffb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373216c551b7fed67ebbcb9c4cac5afd1e715a5fda849d977798c3690dc4f928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.936454 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.949519 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d6a6dd75fa3f860ac6164f4065d5fbb3ea31148102e726b792655d5b08e34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.961277 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c51653288453da9ac24f91fdd67f9cec6bea6dc9b90432eeab5350b7e48f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://741ad5608cab1c104f0a912ca7325733265167e85291eb6bc1b08dd5a23a9b4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.974215 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a8b21d2fb73dbe0bf41ef524cce8befb05dca708bc568a567ccd71bc1434b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:49:59 crc kubenswrapper[4664]: I1003 07:49:59.990655 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:49:59Z is after 2025-08-24T17:21:41Z" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.015475 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9408c318-d840-4eff-815e-152565efafbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ccdd0e3981ae1631d21b940d6ab096e6c5a8f62ea0d9edfba00c925b7b4a235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9054c0ddac91219be608702604b3f2fc398dcb23cd4ae2c87e29aea2267383a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://081e3f10dbfcd175f2595aabad8ff020b4878a8605ab4f85d68ecd8178bda548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://111f5ec5819c2f11bfc65c9cf5308b5d3ffc4b3bb6f8b3f65f31a9323b56d9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b42bc27e52911b15b17e3effdc251425df3177be375aed26649ed02cea13e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc449fe962a0144d2f3088d5d5b4a8769035aca8b07118aca83de2cd3183651b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc449fe962a0144d2f3088d5d5b4a8769035aca8b07118aca83de2cd3183651b\\\",\\\"exitCode\\\"
:0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01406adf04f1c29dcb0acaa2268c9514d4712a73f7f055149ae1507b1cdbb088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01406adf04f1c29dcb0acaa2268c9514d4712a73f7f055149ae1507b1cdbb088\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f6097fc39412ae589b05b16aede4059ae55e0bf6251ff504832fff0bba157aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6097fc39412ae589b05b16aede4059ae55e0bf6251ff504832fff0bba157aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:50:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.021807 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.021969 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.021986 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.022011 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.022034 4664 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:00Z","lastTransitionTime":"2025-10-03T07:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.029286 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:50:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.042323 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b81ce-0ce7-498f-9337-ae5e6e64682b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf6aded6ff3dcd5fbea90e46bb42d6db6b963c9688abfed6fa40662139fb80f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx92c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x9dgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:50:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.053276 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9z9q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4492bc6-9e61-4748-935e-e070a703c05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aac5ac1e508036a7e3cee3eb11577734e7923c4076399288863f294b7efa3fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqzrd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9z9q9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:50:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.063743 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4391bdd-694b-4c79-8482-12ecc43e15a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d71e13c58bbb5cba74df86672f02d7970dae7bc41a9c88aa6652f98c62fa7122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b60eef776685fde325952f0aa6c0b2a679d105f02227deae22666253ae8596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19b60eef776685fde325952f0aa6c0b2a679d105f02227deae22666253ae8596\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:50:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.076014 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-72cm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6998d742-8d17-4f20-ab52-c30d9f7b0b89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482e54714945acaea85fdeeb4b89eb9b16568c96319d07eb812ef88bd5faeb85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:32Z\\\",\\\"message\\\":\\\"2025-10-03T07:48:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650\\\\n2025-10-03T07:48:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb5617ca-9f7d-47d6-b67b-fe6a5db8a650 to /host/opt/cni/bin/\\\\n2025-10-03T07:48:47Z [verbose] multus-daemon started\\\\n2025-10-03T07:48:47Z [verbose] Readiness Indicator file check\\\\n2025-10-03T07:49:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9hpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-72cm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:50:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.090047 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
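The kube-multus restart recorded above is a timeout rather than a crash: the daemon polls for a "readiness indicator" file that the default network (here OVN-Kubernetes) is expected to write, and exits when the poll gives up ("pollimmediate error: timed out waiting for the condition"). A rough Python equivalent of that wait loop; the file path comes from the log, while the interval and timeout are invented for the sketch (the log's timestamps suggest roughly 45 seconds between start and failure):

```python
# Rough sketch of multus's readiness-indicator wait: poll until the default
# network's CNI config appears, fail after a deadline. The path is from the
# log; INTERVAL and TIMEOUT are assumed values, not multus's real settings.
import os
import time

INDICATOR = "/host/run/multus/cni/net.d/10-ovn-kubernetes.conf"  # from the log
INTERVAL, TIMEOUT = 1.0, 45.0  # assumed values for the sketch

def wait_for_indicator(path: str, interval: float, timeout: float) -> None:
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:  # poll-immediate: check first, then sleep
        if os.path.exists(path):
            return
        time.sleep(interval)
    raise TimeoutError(f"still waiting for readinessindicatorfile @ {path}")

try:
    wait_for_indicator(INDICATOR, INTERVAL, TIMEOUT)
    print("default network ready")
except TimeoutError as err:
    print(f"pollimmediate error: {err}")
```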
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7002f9af-9339-4bec-8b7a-ce0c1d20f3ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96ea8ece692f200d9dbe3650ba30a3680897c6d7c33c9b359b27e31692689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f580c83a6ce9959ed5a1c8b78834f7d6caecaeff3ed366fdef991ff728b08d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed30aa5b737df201bfca9e17073255b5d791430429d86149107650c38a1c0c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89484df39df055036ea8f81fb4e052a14292e7268e7c52088a497348141c3a81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://387893d4cb2d89ff9f370846815705aaa42ff8f4f3fc86d08fd06509d17af8e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T07:48:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 07:48:39.005956 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 07:48:39.006283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 07:48:39.007038 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3527224080/tls.crt::/tmp/serving-cert-3527224080/tls.key\\\\\\\"\\\\nI1003 07:48:39.474760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 07:48:39.479276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 07:48:39.479354 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 07:48:39.479401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 07:48:39.479427 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 07:48:39.489134 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 07:48:39.489225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 07:48:39.489271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 07:48:39.489291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 07:48:39.489311 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 07:48:39.489336 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 07:48:39.489506 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 07:48:39.491496 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1078b48064376958ba64f9040dc1973628494ff4c9935c90c296bdb8a27f8738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c678c86754c6ec4bfb825856edb788d6f8a6d6b7b62074ee46de1a5c6568eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:50:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.110843 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T07:49:46Z\\\",\\\"message\\\":\\\"ternalversions/factory.go:141\\\\nI1003 07:49:46.647758 6739 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 07:49:46.647937 6739 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 07:49:46.648037 6739 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 07:49:46.648258 6739 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 07:49:46.648314 6739 factory.go:656] Stopping watch factory\\\\nI1003 07:49:46.648371 6739 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 07:49:46.677200 6739 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1003 07:49:46.677255 6739 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1003 07:49:46.677350 6739 ovnkube.go:599] Stopped ovnkube\\\\nI1003 07:49:46.677384 6739 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 07:49:46.677538 6739 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T07:49:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k42nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2jpvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:50:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.124691 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.124737 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.124751 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.124764 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.124775 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:00Z","lastTransitionTime":"2025-10-03T07:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
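The `CrashLoopBackOff` on `ovnkube-controller` above follows the kubelet's standard restart backoff: the delay starts at 10s and doubles per failed restart, capped at five minutes, which is why a container showing `restartCount: 3` sits in "back-off 40s". A one-liner to reproduce the schedule; the 10s base and 300s cap are the kubelet defaults, stated here as assumptions in case a build differs:

```python
# Kubelet-style crash-loop backoff: 10s base, doubling, 5-minute cap.
# restartCount 3 in the log lines up with the observed "back-off 40s".
BASE, CAP = 10, 300  # seconds; kubelet defaults, treated as assumptions

def backoff(restart_count: int) -> int:
    return min(BASE * 2 ** max(restart_count - 1, 0), CAP)

print([backoff(n) for n in range(1, 8)])  # [10, 20, 40, 80, 160, 300, 300]
```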
Has your network provider started?"} Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.126250 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h865c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e441c3-e8db-4705-9da1-0c6513d57048\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de7994382833aa2a4b6d7926e96257038511c4f547f24336eb11f3cf89e2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b1d7eee670c65e2bb98dfd968c435f1be57b6d77c438f22a0879145c8d631e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4866f21ecaa4c4b0090cd60ec47d3b21f7451bef3ec10916326fc8d71f53ab11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ac6054af3d81f8a5f644041c827bd2ce4009b168a21e1b497b09e99ad5f7e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6652062f119d14db463c5e397b3730c798000ac1aa9b158476ffeac05f6451f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74477ea62a72636a31cd373f68b7ec13a7c4fecacc368198437c66ad4dddd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7a0ba16408bb7b21d7fd875b7e1788bd6e3441df436360d7b908d24ad04fdf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T07:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T07:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br5ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h865c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:50:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.138240 4664 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988af70a-5398-4c96-b2a7-4b8e143303bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f95198276df57dfddfafb72c325577486b3eddb40e73fbe4afa95e8ce8a6183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd25565edabdcc002b63aece712eb1cf0ff4eece8277aa01ad4875f2977f63ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T07:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cr7ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-27zqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T07:50:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.148800 4664 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l687s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T07:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T07:48:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l687s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T07:50:00Z is after 2025-08-24T17:21:41Z" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.227047 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:00 crc 
kubenswrapper[4664]: I1003 07:50:00.227084 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.227094 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.227109 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.227119 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:00Z","lastTransitionTime":"2025-10-03T07:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.841205 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.841241 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.841252 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.841264 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.841273 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:00Z","lastTransitionTime":"2025-10-03T07:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.875379 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:00 crc kubenswrapper[4664]: E1003 07:50:00.875652 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.943516 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.943562 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.943576 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.943594 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:00 crc kubenswrapper[4664]: I1003 07:50:00.943635 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:00Z","lastTransitionTime":"2025-10-03T07:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.046075 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.046120 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.046133 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.046147 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.046158 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:01Z","lastTransitionTime":"2025-10-03T07:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.148882 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.148930 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.148942 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.148959 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.148969 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:01Z","lastTransitionTime":"2025-10-03T07:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.251300 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.251341 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.251353 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.251369 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.251379 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:01Z","lastTransitionTime":"2025-10-03T07:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.354384 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.354430 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.354445 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.354462 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.354472 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:01Z","lastTransitionTime":"2025-10-03T07:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.457863 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.457920 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.457932 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.457955 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.457969 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:01Z","lastTransitionTime":"2025-10-03T07:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.560409 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.560455 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.560468 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.560484 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.560496 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:01Z","lastTransitionTime":"2025-10-03T07:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.662496 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.662535 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.662545 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.662558 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.662567 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:01Z","lastTransitionTime":"2025-10-03T07:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.765293 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.765336 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.765344 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.765358 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.765368 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:01Z","lastTransitionTime":"2025-10-03T07:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.867424 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.867459 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.867469 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.867484 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.867493 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:01Z","lastTransitionTime":"2025-10-03T07:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.875822 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.875875 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:50:01 crc kubenswrapper[4664]: E1003 07:50:01.875938 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.875946 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:50:01 crc kubenswrapper[4664]: E1003 07:50:01.876665 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:50:01 crc kubenswrapper[4664]: E1003 07:50:01.876732 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.876999 4664 scope.go:117] "RemoveContainer" containerID="46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690" Oct 03 07:50:01 crc kubenswrapper[4664]: E1003 07:50:01.877179 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.969939 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.970239 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.970358 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.970459 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:01 crc kubenswrapper[4664]: I1003 07:50:01.970541 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:01Z","lastTransitionTime":"2025-10-03T07:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.072991 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.073028 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.073037 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.073050 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.073058 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:02Z","lastTransitionTime":"2025-10-03T07:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.175110 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.175157 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.175172 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.175187 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.175197 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:02Z","lastTransitionTime":"2025-10-03T07:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.277768 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.277809 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.277821 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.277839 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.277857 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:02Z","lastTransitionTime":"2025-10-03T07:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.380698 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.380734 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.380748 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.380762 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.380773 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:02Z","lastTransitionTime":"2025-10-03T07:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.483110 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.483155 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.483173 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.483190 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.483201 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:02Z","lastTransitionTime":"2025-10-03T07:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.585212 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.585257 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.585274 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.585288 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.585299 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:02Z","lastTransitionTime":"2025-10-03T07:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.687649 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.687693 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.687702 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.687714 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.687724 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:02Z","lastTransitionTime":"2025-10-03T07:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.790833 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.790881 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.790894 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.790908 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.790919 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:02Z","lastTransitionTime":"2025-10-03T07:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.875338 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:02 crc kubenswrapper[4664]: E1003 07:50:02.875469 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.893185 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.893244 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.893258 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.893277 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.893287 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:02Z","lastTransitionTime":"2025-10-03T07:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.996006 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.996043 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.996051 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.996065 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:02 crc kubenswrapper[4664]: I1003 07:50:02.996074 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:02Z","lastTransitionTime":"2025-10-03T07:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.098427 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.098473 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.098485 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.098502 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.098514 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:03Z","lastTransitionTime":"2025-10-03T07:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.200809 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.200879 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.200889 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.200901 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.200910 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:03Z","lastTransitionTime":"2025-10-03T07:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.303743 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.303801 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.303815 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.303831 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.303842 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:03Z","lastTransitionTime":"2025-10-03T07:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.406296 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.406338 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.406348 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.406361 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.406370 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:03Z","lastTransitionTime":"2025-10-03T07:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.501139 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs\") pod \"network-metrics-daemon-l687s\" (UID: \"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\") " pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:03 crc kubenswrapper[4664]: E1003 07:50:03.501289 4664 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:50:03 crc kubenswrapper[4664]: E1003 07:50:03.501360 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs podName:7f2800e0-b66e-4ab2-ad4f-37c5ffe60120 nodeName:}" failed. No retries permitted until 2025-10-03 07:51:07.501340141 +0000 UTC m=+168.322530641 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs") pod "network-metrics-daemon-l687s" (UID: "7f2800e0-b66e-4ab2-ad4f-37c5ffe60120") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.508557 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.508616 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.508640 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.508656 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.508669 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:03Z","lastTransitionTime":"2025-10-03T07:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.610896 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.610972 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.610984 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.611003 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.611015 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:03Z","lastTransitionTime":"2025-10-03T07:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.713353 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.713397 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.713407 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.713421 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.713431 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:03Z","lastTransitionTime":"2025-10-03T07:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.876254 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 07:50:03 crc kubenswrapper[4664]: E1003 07:50:03.876393 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.876625 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 07:50:03 crc kubenswrapper[4664]: E1003 07:50:03.876684 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 07:50:03 crc kubenswrapper[4664]: I1003 07:50:03.876911 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 07:50:03 crc kubenswrapper[4664]: E1003 07:50:03.877057 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 03 07:50:04 crc kubenswrapper[4664]: I1003 07:50:04.875983 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s"
Oct 03 07:50:04 crc kubenswrapper[4664]: E1003 07:50:04.876212 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120"
Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.839836 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.839880 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.839890 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.839906 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.839919 4664 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T07:50:07Z","lastTransitionTime":"2025-10-03T07:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.886982 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp"] Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.887477 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.889251 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.889496 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.889716 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.889786 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.912754 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=23.912738885 podStartE2EDuration="23.912738885s" podCreationTimestamp="2025-10-03 07:49:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:07.900803623 +0000 UTC m=+108.721994123" watchObservedRunningTime="2025-10-03 07:50:07.912738885 +0000 UTC m=+108.733929375" Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.928259 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-72cm2" podStartSLOduration=82.928240058 podStartE2EDuration="1m22.928240058s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:07.912701663 +0000 UTC m=+108.733892173" watchObservedRunningTime="2025-10-03 07:50:07.928240058 +0000 UTC m=+108.749430548" Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.931966 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=87.931943574 podStartE2EDuration="1m27.931943574s" podCreationTimestamp="2025-10-03 07:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:07.928224578 +0000 UTC m=+108.749415088" watchObservedRunningTime="2025-10-03 07:50:07.931943574 +0000 UTC m=+108.753134064" Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.948082 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6a6a39d-9e3b-433d-991a-401e1015e04e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hzllp\" (UID: \"d6a6a39d-9e3b-433d-991a-401e1015e04e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.948135 4664 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6a6a39d-9e3b-433d-991a-401e1015e04e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hzllp\" (UID: \"d6a6a39d-9e3b-433d-991a-401e1015e04e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.948463 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d6a6a39d-9e3b-433d-991a-401e1015e04e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hzllp\" (UID: \"d6a6a39d-9e3b-433d-991a-401e1015e04e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.948513 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d6a6a39d-9e3b-433d-991a-401e1015e04e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hzllp\" (UID: \"d6a6a39d-9e3b-433d-991a-401e1015e04e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.948649 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6a6a39d-9e3b-433d-991a-401e1015e04e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hzllp\" (UID: \"d6a6a39d-9e3b-433d-991a-401e1015e04e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.987531 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-h865c" podStartSLOduration=82.987512014 podStartE2EDuration="1m22.987512014s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:07.976750806 +0000 UTC m=+108.797941316" watchObservedRunningTime="2025-10-03 07:50:07.987512014 +0000 UTC m=+108.808702504" Oct 03 07:50:07 crc kubenswrapper[4664]: I1003 07:50:07.988065 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-27zqq" podStartSLOduration=82.98805859 podStartE2EDuration="1m22.98805859s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:07.987117983 +0000 UTC m=+108.808308473" watchObservedRunningTime="2025-10-03 07:50:07.98805859 +0000 UTC m=+108.809249080" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.024462 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hqfhs" podStartSLOduration=84.024447321 podStartE2EDuration="1m24.024447321s" podCreationTimestamp="2025-10-03 07:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:08.02439221 +0000 UTC m=+108.845582720" watchObservedRunningTime="2025-10-03 07:50:08.024447321 +0000 UTC m=+108.845637811" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.049967 4664 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d6a6a39d-9e3b-433d-991a-401e1015e04e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hzllp\" (UID: \"d6a6a39d-9e3b-433d-991a-401e1015e04e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.050032 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d6a6a39d-9e3b-433d-991a-401e1015e04e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hzllp\" (UID: \"d6a6a39d-9e3b-433d-991a-401e1015e04e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.050073 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d6a6a39d-9e3b-433d-991a-401e1015e04e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hzllp\" (UID: \"d6a6a39d-9e3b-433d-991a-401e1015e04e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.050087 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6a6a39d-9e3b-433d-991a-401e1015e04e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hzllp\" (UID: \"d6a6a39d-9e3b-433d-991a-401e1015e04e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.050146 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6a6a39d-9e3b-433d-991a-401e1015e04e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hzllp\" (UID: \"d6a6a39d-9e3b-433d-991a-401e1015e04e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.050169 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6a6a39d-9e3b-433d-991a-401e1015e04e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hzllp\" (UID: \"d6a6a39d-9e3b-433d-991a-401e1015e04e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.050207 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d6a6a39d-9e3b-433d-991a-401e1015e04e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hzllp\" (UID: \"d6a6a39d-9e3b-433d-991a-401e1015e04e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.051139 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6a6a39d-9e3b-433d-991a-401e1015e04e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hzllp\" (UID: \"d6a6a39d-9e3b-433d-991a-401e1015e04e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.054258 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podStartSLOduration=89.054242854 podStartE2EDuration="1m29.054242854s" podCreationTimestamp="2025-10-03 07:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:08.039125621 +0000 UTC m=+108.860316131" watchObservedRunningTime="2025-10-03 07:50:08.054242854 +0000 UTC m=+108.875433344" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.058491 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6a6a39d-9e3b-433d-991a-401e1015e04e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hzllp\" (UID: \"d6a6a39d-9e3b-433d-991a-401e1015e04e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.068232 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6a6a39d-9e3b-433d-991a-401e1015e04e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hzllp\" (UID: \"d6a6a39d-9e3b-433d-991a-401e1015e04e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.073038 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=60.073014591 podStartE2EDuration="1m0.073014591s" podCreationTimestamp="2025-10-03 07:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:08.056058496 +0000 UTC m=+108.877248996" watchObservedRunningTime="2025-10-03 07:50:08.073014591 +0000 UTC m=+108.894205091" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.181433 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=28.181410103 podStartE2EDuration="28.181410103s" podCreationTimestamp="2025-10-03 07:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:08.180625561 +0000 UTC m=+109.001816071" watchObservedRunningTime="2025-10-03 07:50:08.181410103 +0000 UTC m=+109.002600593" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.199910 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.207403 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podStartSLOduration=84.207372216 podStartE2EDuration="1m24.207372216s" podCreationTimestamp="2025-10-03 07:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:08.206407639 +0000 UTC m=+109.027598149" watchObservedRunningTime="2025-10-03 07:50:08.207372216 +0000 UTC m=+109.028562706" Oct 03 07:50:08 crc kubenswrapper[4664]: W1003 07:50:08.214368 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6a6a39d_9e3b_433d_991a_401e1015e04e.slice/crio-db8c2e8b7d06011f22a1c80bb6805cc7175ee24a90e63dd041c3a8f5913d6b48 WatchSource:0}: Error finding container db8c2e8b7d06011f22a1c80bb6805cc7175ee24a90e63dd041c3a8f5913d6b48: Status 404 returned error can't find the container with id db8c2e8b7d06011f22a1c80bb6805cc7175ee24a90e63dd041c3a8f5913d6b48 Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.399869 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" event={"ID":"d6a6a39d-9e3b-433d-991a-401e1015e04e","Type":"ContainerStarted","Data":"3a4cf8f52e9b57caa38ee05fc6df95edb1a465a5bc2daf33e061410fdd5d254d"} Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.399948 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" event={"ID":"d6a6a39d-9e3b-433d-991a-401e1015e04e","Type":"ContainerStarted","Data":"db8c2e8b7d06011f22a1c80bb6805cc7175ee24a90e63dd041c3a8f5913d6b48"} Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.417280 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9z9q9" podStartSLOduration=84.417252743 podStartE2EDuration="1m24.417252743s" podCreationTimestamp="2025-10-03 07:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:08.222060657 +0000 UTC m=+109.043251167" watchObservedRunningTime="2025-10-03 07:50:08.417252743 +0000 UTC m=+109.238443233" Oct 03 07:50:08 crc kubenswrapper[4664]: I1003 07:50:08.875760 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:08 crc kubenswrapper[4664]: E1003 07:50:08.875883 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:09 crc kubenswrapper[4664]: I1003 07:50:09.875559 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:50:09 crc kubenswrapper[4664]: I1003 07:50:09.875573 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:50:09 crc kubenswrapper[4664]: I1003 07:50:09.876820 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:50:09 crc kubenswrapper[4664]: E1003 07:50:09.876949 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:50:09 crc kubenswrapper[4664]: E1003 07:50:09.876985 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:50:09 crc kubenswrapper[4664]: E1003 07:50:09.877045 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:50:10 crc kubenswrapper[4664]: I1003 07:50:10.875291 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:10 crc kubenswrapper[4664]: E1003 07:50:10.875752 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:11 crc kubenswrapper[4664]: I1003 07:50:11.876712 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:50:11 crc kubenswrapper[4664]: I1003 07:50:11.877070 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:50:11 crc kubenswrapper[4664]: E1003 07:50:11.877055 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:50:11 crc kubenswrapper[4664]: E1003 07:50:11.877169 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:50:11 crc kubenswrapper[4664]: I1003 07:50:11.877357 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:50:11 crc kubenswrapper[4664]: E1003 07:50:11.877431 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:50:12 crc kubenswrapper[4664]: I1003 07:50:12.875227 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:12 crc kubenswrapper[4664]: E1003 07:50:12.875367 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:13 crc kubenswrapper[4664]: I1003 07:50:13.875821 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:50:13 crc kubenswrapper[4664]: E1003 07:50:13.875935 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:50:13 crc kubenswrapper[4664]: I1003 07:50:13.876134 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:50:13 crc kubenswrapper[4664]: E1003 07:50:13.876185 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:50:13 crc kubenswrapper[4664]: I1003 07:50:13.876360 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:50:13 crc kubenswrapper[4664]: E1003 07:50:13.876403 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:50:14 crc kubenswrapper[4664]: I1003 07:50:14.875829 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:14 crc kubenswrapper[4664]: E1003 07:50:14.876306 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:14 crc kubenswrapper[4664]: I1003 07:50:14.876650 4664 scope.go:117] "RemoveContainer" containerID="46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690" Oct 03 07:50:14 crc kubenswrapper[4664]: E1003 07:50:14.876844 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2jpvm_openshift-ovn-kubernetes(8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" Oct 03 07:50:15 crc kubenswrapper[4664]: I1003 07:50:15.876222 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:50:15 crc kubenswrapper[4664]: E1003 07:50:15.876442 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:50:15 crc kubenswrapper[4664]: I1003 07:50:15.876533 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:50:15 crc kubenswrapper[4664]: I1003 07:50:15.876664 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:50:15 crc kubenswrapper[4664]: E1003 07:50:15.876669 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:50:15 crc kubenswrapper[4664]: E1003 07:50:15.876748 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:50:16 crc kubenswrapper[4664]: I1003 07:50:16.875226 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:16 crc kubenswrapper[4664]: E1003 07:50:16.875427 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:17 crc kubenswrapper[4664]: I1003 07:50:17.875441 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:50:17 crc kubenswrapper[4664]: I1003 07:50:17.875517 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:50:17 crc kubenswrapper[4664]: I1003 07:50:17.875595 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:50:17 crc kubenswrapper[4664]: E1003 07:50:17.875717 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:50:17 crc kubenswrapper[4664]: E1003 07:50:17.875838 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:50:17 crc kubenswrapper[4664]: E1003 07:50:17.875920 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:50:18 crc kubenswrapper[4664]: I1003 07:50:18.875588 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:18 crc kubenswrapper[4664]: E1003 07:50:18.875778 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:19 crc kubenswrapper[4664]: I1003 07:50:19.435543 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-72cm2_6998d742-8d17-4f20-ab52-c30d9f7b0b89/kube-multus/1.log" Oct 03 07:50:19 crc kubenswrapper[4664]: I1003 07:50:19.436100 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-72cm2_6998d742-8d17-4f20-ab52-c30d9f7b0b89/kube-multus/0.log" Oct 03 07:50:19 crc kubenswrapper[4664]: I1003 07:50:19.436174 4664 generic.go:334] "Generic (PLEG): container finished" podID="6998d742-8d17-4f20-ab52-c30d9f7b0b89" containerID="482e54714945acaea85fdeeb4b89eb9b16568c96319d07eb812ef88bd5faeb85" exitCode=1 Oct 03 07:50:19 crc kubenswrapper[4664]: I1003 07:50:19.436230 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-72cm2" event={"ID":"6998d742-8d17-4f20-ab52-c30d9f7b0b89","Type":"ContainerDied","Data":"482e54714945acaea85fdeeb4b89eb9b16568c96319d07eb812ef88bd5faeb85"} Oct 03 07:50:19 crc kubenswrapper[4664]: I1003 07:50:19.436301 4664 scope.go:117] "RemoveContainer" containerID="a93e9f047fd6ffc420a229d1159b5c53c40c5efe1f945e45bafa3ba93a69caef" Oct 03 07:50:19 crc kubenswrapper[4664]: I1003 07:50:19.436970 4664 scope.go:117] "RemoveContainer" containerID="482e54714945acaea85fdeeb4b89eb9b16568c96319d07eb812ef88bd5faeb85" Oct 03 07:50:19 crc kubenswrapper[4664]: E1003 07:50:19.437261 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-72cm2_openshift-multus(6998d742-8d17-4f20-ab52-c30d9f7b0b89)\"" pod="openshift-multus/multus-72cm2" podUID="6998d742-8d17-4f20-ab52-c30d9f7b0b89" Oct 03 07:50:19 crc kubenswrapper[4664]: I1003 07:50:19.459863 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzllp" podStartSLOduration=94.459845352 podStartE2EDuration="1m34.459845352s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:08.41854968 +0000 UTC m=+109.239740180" watchObservedRunningTime="2025-10-03 07:50:19.459845352 +0000 UTC m=+120.281035842" Oct 03 07:50:19 crc kubenswrapper[4664]: I1003 07:50:19.875265 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:50:19 crc kubenswrapper[4664]: I1003 07:50:19.875299 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:50:19 crc kubenswrapper[4664]: I1003 07:50:19.875316 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:50:19 crc kubenswrapper[4664]: E1003 07:50:19.876261 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:50:19 crc kubenswrapper[4664]: E1003 07:50:19.876378 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:50:19 crc kubenswrapper[4664]: E1003 07:50:19.876461 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:50:19 crc kubenswrapper[4664]: E1003 07:50:19.910360 4664 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 03 07:50:19 crc kubenswrapper[4664]: E1003 07:50:19.973459 4664 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 07:50:20 crc kubenswrapper[4664]: I1003 07:50:20.440878 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-72cm2_6998d742-8d17-4f20-ab52-c30d9f7b0b89/kube-multus/1.log" Oct 03 07:50:20 crc kubenswrapper[4664]: I1003 07:50:20.875636 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:20 crc kubenswrapper[4664]: E1003 07:50:20.875761 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:21 crc kubenswrapper[4664]: I1003 07:50:21.876053 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:50:21 crc kubenswrapper[4664]: I1003 07:50:21.876070 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:50:21 crc kubenswrapper[4664]: I1003 07:50:21.876238 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:50:21 crc kubenswrapper[4664]: E1003 07:50:21.876429 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:50:21 crc kubenswrapper[4664]: E1003 07:50:21.876477 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:50:21 crc kubenswrapper[4664]: E1003 07:50:21.876332 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:50:22 crc kubenswrapper[4664]: I1003 07:50:22.876258 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:22 crc kubenswrapper[4664]: E1003 07:50:22.876454 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:23 crc kubenswrapper[4664]: I1003 07:50:23.876072 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:50:23 crc kubenswrapper[4664]: I1003 07:50:23.876096 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:50:23 crc kubenswrapper[4664]: I1003 07:50:23.876105 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:50:23 crc kubenswrapper[4664]: E1003 07:50:23.877017 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:50:23 crc kubenswrapper[4664]: E1003 07:50:23.877248 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:50:23 crc kubenswrapper[4664]: E1003 07:50:23.877416 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:50:24 crc kubenswrapper[4664]: I1003 07:50:24.875321 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:24 crc kubenswrapper[4664]: E1003 07:50:24.875665 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:24 crc kubenswrapper[4664]: E1003 07:50:24.974995 4664 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 07:50:25 crc kubenswrapper[4664]: I1003 07:50:25.875914 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:50:25 crc kubenswrapper[4664]: I1003 07:50:25.875914 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:50:25 crc kubenswrapper[4664]: I1003 07:50:25.875945 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:50:25 crc kubenswrapper[4664]: E1003 07:50:25.876333 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:50:25 crc kubenswrapper[4664]: E1003 07:50:25.876495 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:50:25 crc kubenswrapper[4664]: E1003 07:50:25.876065 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:50:26 crc kubenswrapper[4664]: I1003 07:50:26.875870 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:26 crc kubenswrapper[4664]: E1003 07:50:26.876586 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:27 crc kubenswrapper[4664]: I1003 07:50:27.876224 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:50:27 crc kubenswrapper[4664]: I1003 07:50:27.876257 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:50:27 crc kubenswrapper[4664]: E1003 07:50:27.876362 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:50:27 crc kubenswrapper[4664]: I1003 07:50:27.876244 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:50:27 crc kubenswrapper[4664]: E1003 07:50:27.876455 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:50:27 crc kubenswrapper[4664]: E1003 07:50:27.876524 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:50:28 crc kubenswrapper[4664]: I1003 07:50:28.876425 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:28 crc kubenswrapper[4664]: E1003 07:50:28.877008 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:28 crc kubenswrapper[4664]: I1003 07:50:28.877139 4664 scope.go:117] "RemoveContainer" containerID="46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690" Oct 03 07:50:29 crc kubenswrapper[4664]: I1003 07:50:29.476805 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/3.log" Oct 03 07:50:29 crc kubenswrapper[4664]: I1003 07:50:29.479750 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerStarted","Data":"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"} Oct 03 07:50:29 crc kubenswrapper[4664]: I1003 07:50:29.480138 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:50:29 crc kubenswrapper[4664]: I1003 07:50:29.817402 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podStartSLOduration=104.817374349 podStartE2EDuration="1m44.817374349s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:29.507949584 +0000 UTC m=+130.329140144" watchObservedRunningTime="2025-10-03 07:50:29.817374349 +0000 UTC m=+130.638564839" Oct 03 07:50:29 crc kubenswrapper[4664]: I1003 07:50:29.818824 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l687s"] Oct 03 07:50:29 crc kubenswrapper[4664]: I1003 07:50:29.819442 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:29 crc kubenswrapper[4664]: E1003 07:50:29.819716 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:29 crc kubenswrapper[4664]: I1003 07:50:29.875853 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:50:29 crc kubenswrapper[4664]: I1003 07:50:29.875890 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:50:29 crc kubenswrapper[4664]: I1003 07:50:29.875925 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:50:29 crc kubenswrapper[4664]: E1003 07:50:29.877158 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:50:29 crc kubenswrapper[4664]: I1003 07:50:29.877596 4664 scope.go:117] "RemoveContainer" containerID="482e54714945acaea85fdeeb4b89eb9b16568c96319d07eb812ef88bd5faeb85" Oct 03 07:50:29 crc kubenswrapper[4664]: E1003 07:50:29.878013 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:50:29 crc kubenswrapper[4664]: E1003 07:50:29.878133 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:50:29 crc kubenswrapper[4664]: E1003 07:50:29.975414 4664 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 07:50:30 crc kubenswrapper[4664]: I1003 07:50:30.485992 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-72cm2_6998d742-8d17-4f20-ab52-c30d9f7b0b89/kube-multus/1.log" Oct 03 07:50:30 crc kubenswrapper[4664]: I1003 07:50:30.486189 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-72cm2" event={"ID":"6998d742-8d17-4f20-ab52-c30d9f7b0b89","Type":"ContainerStarted","Data":"4d89d1654dd2e1ba9bea8e000dac63af587a2f465fc475abfe21847cbc232292"} Oct 03 07:50:31 crc kubenswrapper[4664]: I1003 07:50:31.875648 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:50:31 crc kubenswrapper[4664]: I1003 07:50:31.875684 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:31 crc kubenswrapper[4664]: I1003 07:50:31.875668 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:50:31 crc kubenswrapper[4664]: I1003 07:50:31.875664 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:50:31 crc kubenswrapper[4664]: E1003 07:50:31.875776 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:50:31 crc kubenswrapper[4664]: E1003 07:50:31.875853 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:31 crc kubenswrapper[4664]: E1003 07:50:31.875986 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:50:31 crc kubenswrapper[4664]: E1003 07:50:31.876086 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:50:33 crc kubenswrapper[4664]: I1003 07:50:33.875220 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:50:33 crc kubenswrapper[4664]: I1003 07:50:33.875220 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:50:33 crc kubenswrapper[4664]: I1003 07:50:33.875242 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:50:33 crc kubenswrapper[4664]: I1003 07:50:33.875339 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:33 crc kubenswrapper[4664]: E1003 07:50:33.875450 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 07:50:33 crc kubenswrapper[4664]: E1003 07:50:33.875544 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 07:50:33 crc kubenswrapper[4664]: E1003 07:50:33.875594 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 07:50:33 crc kubenswrapper[4664]: E1003 07:50:33.875704 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l687s" podUID="7f2800e0-b66e-4ab2-ad4f-37c5ffe60120" Oct 03 07:50:35 crc kubenswrapper[4664]: I1003 07:50:35.875980 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 07:50:35 crc kubenswrapper[4664]: I1003 07:50:35.876885 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 07:50:35 crc kubenswrapper[4664]: I1003 07:50:35.876664 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:50:35 crc kubenswrapper[4664]: I1003 07:50:35.877077 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:50:35 crc kubenswrapper[4664]: I1003 07:50:35.879508 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 03 07:50:35 crc kubenswrapper[4664]: I1003 07:50:35.879687 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 03 07:50:35 crc kubenswrapper[4664]: I1003 07:50:35.880300 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 03 07:50:35 crc kubenswrapper[4664]: I1003 07:50:35.880471 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 03 07:50:35 crc kubenswrapper[4664]: I1003 07:50:35.880523 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 03 07:50:35 crc kubenswrapper[4664]: I1003 07:50:35.880532 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 03 07:50:37 crc kubenswrapper[4664]: I1003 07:50:37.803444 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.567407 4664 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.612367 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-87c8s"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.613219 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.619053 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.619053 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.619732 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wd7mj"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.620508 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.621409 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.621960 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.622020 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.622187 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.622522 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.623238 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.624854 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.625189 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.625405 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.625735 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.625909 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.626048 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.628362 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.628935 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.629940 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.633305 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.633519 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.633828 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.633993 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.634186 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.634753 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.635091 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.635160 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.635303 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.635455 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.635748 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.636415 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8sdmc"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.636715 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.637101 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.636826 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.637405 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.638416 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.638798 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.639542 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.641424 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.642060 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.645493 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.645680 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.645837 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.645992 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.646021 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.646206 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.646539 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.646640 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.646833 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.646869 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.647054 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 
07:50:38.647189 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.647404 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.648879 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sd2nj"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.649537 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.652147 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.652758 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.653317 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.654819 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.654978 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.655040 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.655175 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.655189 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.655316 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.655355 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.655471 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.655495 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.655627 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.655648 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.657583 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr"] Oct 03 07:50:38 crc 
kubenswrapper[4664]: I1003 07:50:38.658434 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.658806 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jtp75"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.659224 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jtp75" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.661313 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-zcngp"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.661979 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-zcngp" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.663444 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h46lz"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.663871 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.664350 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.665586 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.681061 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b6htg"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.708989 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.709213 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.709229 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-2m6x7"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.709534 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.709811 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b6htg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.710491 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.710872 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.711646 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.713958 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716312 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ftnz\" (UniqueName: \"kubernetes.io/projected/0c70928e-51e2-4c62-8ce6-c1f8ba489f8f-kube-api-access-2ftnz\") pod \"cluster-samples-operator-665b6dd947-b6htg\" (UID: \"0c70928e-51e2-4c62-8ce6-c1f8ba489f8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b6htg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716347 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716372 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d8683a6-e42e-4541-81b3-2b28fc5e7be6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xrzpw\" (UID: \"7d8683a6-e42e-4541-81b3-2b28fc5e7be6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716411 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-serving-cert\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716432 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-config\") pod \"controller-manager-879f6c89f-sd2nj\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716452 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-audit-policies\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 
07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716470 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/171032ce-a5a4-4f30-bdc1-8c39e19efe99-serving-cert\") pod \"openshift-config-operator-7777fb866f-tdjrr\" (UID: \"171032ce-a5a4-4f30-bdc1-8c39e19efe99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716489 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-serving-cert\") pod \"route-controller-manager-6576b87f9c-sm42l\" (UID: \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716507 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-serving-cert\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716524 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-audit-dir\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716547 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716570 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-etcd-client\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716591 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fjs9m\" (UID: \"989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716626 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ceef7ba8-f996-4b56-a477-23873e39cde7-console-serving-cert\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716649 4664 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f7be8a-c6e0-47cd-b9bf-d42447f23980-serving-cert\") pod \"console-operator-58897d9998-jtp75\" (UID: \"33f7be8a-c6e0-47cd-b9bf-d42447f23980\") " pod="openshift-console-operator/console-operator-58897d9998-jtp75" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716683 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6npk\" (UniqueName: \"kubernetes.io/projected/cf947105-e97d-4a1c-9b59-bf6b37461c1e-kube-api-access-w6npk\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716705 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkqr8\" (UniqueName: \"kubernetes.io/projected/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-kube-api-access-gkqr8\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716739 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f7be8a-c6e0-47cd-b9bf-d42447f23980-config\") pod \"console-operator-58897d9998-jtp75\" (UID: \"33f7be8a-c6e0-47cd-b9bf-d42447f23980\") " pod="openshift-console-operator/console-operator-58897d9998-jtp75" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716760 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcfn9\" (UniqueName: \"kubernetes.io/projected/acf5523b-3f1d-495e-8014-0313925e8727-kube-api-access-kcfn9\") pod \"downloads-7954f5f757-zcngp\" (UID: \"acf5523b-3f1d-495e-8014-0313925e8727\") " pod="openshift-console/downloads-7954f5f757-zcngp" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716783 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716805 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-node-pullsecrets\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716826 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ceef7ba8-f996-4b56-a477-23873e39cde7-console-oauth-config\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716850 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-audit-dir\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716873 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-console-config\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716897 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3a78dac-6d58-4647-83fb-b0f36f2f660a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wd7mj\" (UID: \"f3a78dac-6d58-4647-83fb-b0f36f2f660a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716919 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-audit-policies\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716944 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ef3e87db-9d6e-400e-bd6e-bd578baabbf6-machine-approver-tls\") pod \"machine-approver-56656f9798-6t5kh\" (UID: \"ef3e87db-9d6e-400e-bd6e-bd578baabbf6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716971 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-config\") pod \"route-controller-manager-6576b87f9c-sm42l\" (UID: \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.716996 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33f7be8a-c6e0-47cd-b9bf-d42447f23980-trusted-ca\") pod \"console-operator-58897d9998-jtp75\" (UID: \"33f7be8a-c6e0-47cd-b9bf-d42447f23980\") " pod="openshift-console-operator/console-operator-58897d9998-jtp75" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717017 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-audit\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717040 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76eee4ae-1408-4cd8-819a-a5ef5c887b9a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8sdmc\" (UID: 
\"76eee4ae-1408-4cd8-819a-a5ef5c887b9a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717066 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d8683a6-e42e-4541-81b3-2b28fc5e7be6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xrzpw\" (UID: \"7d8683a6-e42e-4541-81b3-2b28fc5e7be6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717096 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f3a78dac-6d58-4647-83fb-b0f36f2f660a-images\") pod \"machine-api-operator-5694c8668f-wd7mj\" (UID: \"f3a78dac-6d58-4647-83fb-b0f36f2f660a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717118 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fjs9m\" (UID: \"989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717141 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a78dac-6d58-4647-83fb-b0f36f2f660a-config\") pod \"machine-api-operator-5694c8668f-wd7mj\" (UID: \"f3a78dac-6d58-4647-83fb-b0f36f2f660a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717160 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf947105-e97d-4a1c-9b59-bf6b37461c1e-audit-dir\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717181 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717204 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2h2p\" (UniqueName: \"kubernetes.io/projected/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-kube-api-access-n2h2p\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717240 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn4s7\" (UniqueName: \"kubernetes.io/projected/f3a78dac-6d58-4647-83fb-b0f36f2f660a-kube-api-access-hn4s7\") pod 
\"machine-api-operator-5694c8668f-wd7mj\" (UID: \"f3a78dac-6d58-4647-83fb-b0f36f2f660a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717262 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbwrk\" (UniqueName: \"kubernetes.io/projected/ef3e87db-9d6e-400e-bd6e-bd578baabbf6-kube-api-access-qbwrk\") pod \"machine-approver-56656f9798-6t5kh\" (UID: \"ef3e87db-9d6e-400e-bd6e-bd578baabbf6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717286 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-serving-cert\") pod \"controller-manager-879f6c89f-sd2nj\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717308 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717332 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-config\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717353 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717372 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717393 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs5mq\" (UniqueName: \"kubernetes.io/projected/33f7be8a-c6e0-47cd-b9bf-d42447f23980-kube-api-access-xs5mq\") pod \"console-operator-58897d9998-jtp75\" (UID: \"33f7be8a-c6e0-47cd-b9bf-d42447f23980\") " pod="openshift-console-operator/console-operator-58897d9998-jtp75" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717415 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: 
\"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717437 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717457 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76eee4ae-1408-4cd8-819a-a5ef5c887b9a-serving-cert\") pod \"authentication-operator-69f744f599-8sdmc\" (UID: \"76eee4ae-1408-4cd8-819a-a5ef5c887b9a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717479 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c70928e-51e2-4c62-8ce6-c1f8ba489f8f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-b6htg\" (UID: \"0c70928e-51e2-4c62-8ce6-c1f8ba489f8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b6htg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717500 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717523 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717549 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717571 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-encryption-config\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717596 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/ef3e87db-9d6e-400e-bd6e-bd578baabbf6-auth-proxy-config\") pod \"machine-approver-56656f9798-6t5kh\" (UID: \"ef3e87db-9d6e-400e-bd6e-bd578baabbf6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717637 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef3e87db-9d6e-400e-bd6e-bd578baabbf6-config\") pod \"machine-approver-56656f9798-6t5kh\" (UID: \"ef3e87db-9d6e-400e-bd6e-bd578baabbf6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717663 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/171032ce-a5a4-4f30-bdc1-8c39e19efe99-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tdjrr\" (UID: \"171032ce-a5a4-4f30-bdc1-8c39e19efe99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717688 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76eee4ae-1408-4cd8-819a-a5ef5c887b9a-config\") pod \"authentication-operator-69f744f599-8sdmc\" (UID: \"76eee4ae-1408-4cd8-819a-a5ef5c887b9a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717709 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gngmq\" (UniqueName: \"kubernetes.io/projected/76eee4ae-1408-4cd8-819a-a5ef5c887b9a-kube-api-access-gngmq\") pod \"authentication-operator-69f744f599-8sdmc\" (UID: \"76eee4ae-1408-4cd8-819a-a5ef5c887b9a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717731 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-client-ca\") pod \"controller-manager-879f6c89f-sd2nj\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717756 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717778 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717802 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh8nh\" 
(UniqueName: \"kubernetes.io/projected/989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5-kube-api-access-sh8nh\") pod \"openshift-apiserver-operator-796bbdcf4f-fjs9m\" (UID: \"989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717822 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-client-ca\") pod \"route-controller-manager-6576b87f9c-sm42l\" (UID: \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717849 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-etcd-client\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717871 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zbwr\" (UniqueName: \"kubernetes.io/projected/7d8683a6-e42e-4541-81b3-2b28fc5e7be6-kube-api-access-5zbwr\") pod \"cluster-image-registry-operator-dc59b4c8b-xrzpw\" (UID: \"7d8683a6-e42e-4541-81b3-2b28fc5e7be6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717895 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-etcd-serving-ca\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717914 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sd2nj\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717935 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bwfx\" (UniqueName: \"kubernetes.io/projected/ceef7ba8-f996-4b56-a477-23873e39cde7-kube-api-access-2bwfx\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717958 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-encryption-config\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.717979 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-service-ca\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.718000 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-trusted-ca-bundle\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.718024 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-image-import-ca\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.718045 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84l7l\" (UniqueName: \"kubernetes.io/projected/171032ce-a5a4-4f30-bdc1-8c39e19efe99-kube-api-access-84l7l\") pod \"openshift-config-operator-7777fb866f-tdjrr\" (UID: \"171032ce-a5a4-4f30-bdc1-8c39e19efe99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.718066 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-oauth-serving-cert\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.718086 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8tqq\" (UniqueName: \"kubernetes.io/projected/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-kube-api-access-n8tqq\") pod \"route-controller-manager-6576b87f9c-sm42l\" (UID: \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.718109 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76eee4ae-1408-4cd8-819a-a5ef5c887b9a-service-ca-bundle\") pod \"authentication-operator-69f744f599-8sdmc\" (UID: \"76eee4ae-1408-4cd8-819a-a5ef5c887b9a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.718131 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqrm7\" (UniqueName: \"kubernetes.io/projected/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-kube-api-access-tqrm7\") pod \"controller-manager-879f6c89f-sd2nj\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.718269 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/7d8683a6-e42e-4541-81b3-2b28fc5e7be6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xrzpw\" (UID: \"7d8683a6-e42e-4541-81b3-2b28fc5e7be6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.718661 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.718822 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719002 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719093 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719131 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719245 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719258 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719337 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719354 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719448 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719456 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719549 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719558 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719643 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719809 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719830 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719949 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719974 4664 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.719988 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.720120 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.720124 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.720209 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.727538 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.727774 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.727808 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-87c8s"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.728041 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.728065 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.728494 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.728656 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.728788 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.728984 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.729042 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.729116 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.729174 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.730215 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.730383 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-szr58"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.731197 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.731282 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h7lkc"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.731991 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h7lkc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.737514 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.745236 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.745841 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.746034 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.746159 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.746173 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.746268 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.755545 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wd7mj"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.763385 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.775391 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.776344 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.776682 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.777350 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.777693 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.779246 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.779387 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jtp75"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.781008 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.783118 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.783970 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.784439 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t9rqg"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.786802 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.789111 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.789737 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.792644 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.793672 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.794201 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.794594 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.795158 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.795431 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.796429 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tm59k"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.796972 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tm59k" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.797112 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.800572 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vsmt5"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.801310 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vsmt5" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.801501 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rlrp9"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.801890 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rlrp9" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.803091 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.804000 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.804185 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.805032 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.806691 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gj7qw"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.807198 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.807654 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.807859 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.808835 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.809820 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.810178 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z8tqh"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.810859 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z8tqh" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.811578 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zsdch"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.812377 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zsdch" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.812864 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.813920 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.814038 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-jxft6"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.814788 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.815186 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.815695 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.816142 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.816799 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b6htg"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.817718 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.818527 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.818812 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2m6x7"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819240 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d8683a6-e42e-4541-81b3-2b28fc5e7be6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xrzpw\" (UID: \"7d8683a6-e42e-4541-81b3-2b28fc5e7be6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819270 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-audit\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819293 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76eee4ae-1408-4cd8-819a-a5ef5c887b9a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8sdmc\" (UID: \"76eee4ae-1408-4cd8-819a-a5ef5c887b9a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819319 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ebb51e4-a635-4c8a-b287-79c0e7d74a9c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-42xd6\" (UID: \"8ebb51e4-a635-4c8a-b287-79c0e7d74a9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819340 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d2108a93-65c6-4626-9fb6-f93854855b80-etcd-client\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819361 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtn69\" (UniqueName: \"kubernetes.io/projected/c47b8dca-2e97-4aba-b303-00c0f2b36ecc-kube-api-access-gtn69\") pod \"service-ca-operator-777779d784-tm59k\" (UID: \"c47b8dca-2e97-4aba-b303-00c0f2b36ecc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tm59k" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819382 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f3a78dac-6d58-4647-83fb-b0f36f2f660a-images\") pod \"machine-api-operator-5694c8668f-wd7mj\" (UID: \"f3a78dac-6d58-4647-83fb-b0f36f2f660a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819402 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/52515c5a-0f7d-42d4-90f8-97e66050f161-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9z9sg\" (UID: \"52515c5a-0f7d-42d4-90f8-97e66050f161\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819418 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-stats-auth\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819434 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2108a93-65c6-4626-9fb6-f93854855b80-config\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819451 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ab24ce7c-1a43-44de-98c3-e9bbdb0c7b6c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vsmt5\" (UID: \"ab24ce7c-1a43-44de-98c3-e9bbdb0c7b6c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vsmt5" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819468 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fjs9m\" (UID: \"989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819484 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a78dac-6d58-4647-83fb-b0f36f2f660a-config\") pod \"machine-api-operator-5694c8668f-wd7mj\" (UID: \"f3a78dac-6d58-4647-83fb-b0f36f2f660a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819502 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf947105-e97d-4a1c-9b59-bf6b37461c1e-audit-dir\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819522 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819541 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6aa30cf-1aa9-4c2a-a079-40182a1c51ef-proxy-tls\") pod \"machine-config-operator-74547568cd-nzfrx\" 
(UID: \"a6aa30cf-1aa9-4c2a-a079-40182a1c51ef\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819561 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6b56\" (UniqueName: \"kubernetes.io/projected/ab24ce7c-1a43-44de-98c3-e9bbdb0c7b6c-kube-api-access-s6b56\") pod \"multus-admission-controller-857f4d67dd-vsmt5\" (UID: \"ab24ce7c-1a43-44de-98c3-e9bbdb0c7b6c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vsmt5" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819580 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2h2p\" (UniqueName: \"kubernetes.io/projected/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-kube-api-access-n2h2p\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819596 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2108a93-65c6-4626-9fb6-f93854855b80-serving-cert\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819647 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn4s7\" (UniqueName: \"kubernetes.io/projected/f3a78dac-6d58-4647-83fb-b0f36f2f660a-kube-api-access-hn4s7\") pod \"machine-api-operator-5694c8668f-wd7mj\" (UID: \"f3a78dac-6d58-4647-83fb-b0f36f2f660a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819665 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbwrk\" (UniqueName: \"kubernetes.io/projected/ef3e87db-9d6e-400e-bd6e-bd578baabbf6-kube-api-access-qbwrk\") pod \"machine-approver-56656f9798-6t5kh\" (UID: \"ef3e87db-9d6e-400e-bd6e-bd578baabbf6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819683 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq8pc\" (UniqueName: \"kubernetes.io/projected/8ebb51e4-a635-4c8a-b287-79c0e7d74a9c-kube-api-access-xq8pc\") pod \"openshift-controller-manager-operator-756b6f6bc6-42xd6\" (UID: \"8ebb51e4-a635-4c8a-b287-79c0e7d74a9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819700 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-serving-cert\") pod \"controller-manager-879f6c89f-sd2nj\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819720 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-config\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " 
pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819736 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819753 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24bdq\" (UniqueName: \"kubernetes.io/projected/a6aa30cf-1aa9-4c2a-a079-40182a1c51ef-kube-api-access-24bdq\") pod \"machine-config-operator-74547568cd-nzfrx\" (UID: \"a6aa30cf-1aa9-4c2a-a079-40182a1c51ef\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819768 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cchd\" (UniqueName: \"kubernetes.io/projected/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-kube-api-access-9cchd\") pod \"collect-profiles-29324625-nj78m\" (UID: \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819783 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcb87980-5888-4a30-859f-a9ac5b95f2c0-service-ca-bundle\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819800 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819817 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819835 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs5mq\" (UniqueName: \"kubernetes.io/projected/33f7be8a-c6e0-47cd-b9bf-d42447f23980-kube-api-access-xs5mq\") pod \"console-operator-58897d9998-jtp75\" (UID: \"33f7be8a-c6e0-47cd-b9bf-d42447f23980\") " pod="openshift-console-operator/console-operator-58897d9998-jtp75" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819851 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc 
kubenswrapper[4664]: I1003 07:50:38.819867 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819882 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76eee4ae-1408-4cd8-819a-a5ef5c887b9a-serving-cert\") pod \"authentication-operator-69f744f599-8sdmc\" (UID: \"76eee4ae-1408-4cd8-819a-a5ef5c887b9a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819899 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqcms\" (UniqueName: \"kubernetes.io/projected/f3284298-3f20-43d6-95ae-7d40c56534d3-kube-api-access-cqcms\") pod \"package-server-manager-789f6589d5-dg8s7\" (UID: \"f3284298-3f20-43d6-95ae-7d40c56534d3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819920 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c70928e-51e2-4c62-8ce6-c1f8ba489f8f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-b6htg\" (UID: \"0c70928e-51e2-4c62-8ce6-c1f8ba489f8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b6htg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819936 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52515c5a-0f7d-42d4-90f8-97e66050f161-proxy-tls\") pod \"machine-config-controller-84d6567774-9z9sg\" (UID: \"52515c5a-0f7d-42d4-90f8-97e66050f161\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.819955 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-secret-volume\") pod \"collect-profiles-29324625-nj78m\" (UID: \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820014 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d2108a93-65c6-4626-9fb6-f93854855b80-etcd-service-ca\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820092 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc 
kubenswrapper[4664]: I1003 07:50:38.820151 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820170 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820186 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-encryption-config\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820229 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-metrics-certs\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820252 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ef3e87db-9d6e-400e-bd6e-bd578baabbf6-auth-proxy-config\") pod \"machine-approver-56656f9798-6t5kh\" (UID: \"ef3e87db-9d6e-400e-bd6e-bd578baabbf6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820270 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef3e87db-9d6e-400e-bd6e-bd578baabbf6-config\") pod \"machine-approver-56656f9798-6t5kh\" (UID: \"ef3e87db-9d6e-400e-bd6e-bd578baabbf6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820320 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/171032ce-a5a4-4f30-bdc1-8c39e19efe99-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tdjrr\" (UID: \"171032ce-a5a4-4f30-bdc1-8c39e19efe99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820341 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76eee4ae-1408-4cd8-819a-a5ef5c887b9a-config\") pod \"authentication-operator-69f744f599-8sdmc\" (UID: \"76eee4ae-1408-4cd8-819a-a5ef5c887b9a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820357 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-gngmq\" (UniqueName: \"kubernetes.io/projected/76eee4ae-1408-4cd8-819a-a5ef5c887b9a-kube-api-access-gngmq\") pod \"authentication-operator-69f744f599-8sdmc\" (UID: \"76eee4ae-1408-4cd8-819a-a5ef5c887b9a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820397 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a6aa30cf-1aa9-4c2a-a079-40182a1c51ef-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nzfrx\" (UID: \"a6aa30cf-1aa9-4c2a-a079-40182a1c51ef\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820418 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820434 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820473 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh8nh\" (UniqueName: \"kubernetes.io/projected/989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5-kube-api-access-sh8nh\") pod \"openshift-apiserver-operator-796bbdcf4f-fjs9m\" (UID: \"989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820500 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-client-ca\") pod \"controller-manager-879f6c89f-sd2nj\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820556 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-etcd-client\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820561 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf947105-e97d-4a1c-9b59-bf6b37461c1e-audit-dir\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820584 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-client-ca\") pod \"route-controller-manager-6576b87f9c-sm42l\" 
(UID: \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820899 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ebb51e4-a635-4c8a-b287-79c0e7d74a9c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-42xd6\" (UID: \"8ebb51e4-a635-4c8a-b287-79c0e7d74a9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.820982 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zbwr\" (UniqueName: \"kubernetes.io/projected/7d8683a6-e42e-4541-81b3-2b28fc5e7be6-kube-api-access-5zbwr\") pod \"cluster-image-registry-operator-dc59b4c8b-xrzpw\" (UID: \"7d8683a6-e42e-4541-81b3-2b28fc5e7be6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.821739 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-etcd-serving-ca\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.821760 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sd2nj\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.821805 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47b8dca-2e97-4aba-b303-00c0f2b36ecc-config\") pod \"service-ca-operator-777779d784-tm59k\" (UID: \"c47b8dca-2e97-4aba-b303-00c0f2b36ecc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tm59k" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.821829 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-encryption-config\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.821886 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-service-ca\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.821911 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-trusted-ca-bundle\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.821936 4664 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bwfx\" (UniqueName: \"kubernetes.io/projected/ceef7ba8-f996-4b56-a477-23873e39cde7-kube-api-access-2bwfx\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.821968 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3284298-3f20-43d6-95ae-7d40c56534d3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dg8s7\" (UID: \"f3284298-3f20-43d6-95ae-7d40c56534d3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.821991 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-image-import-ca\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822010 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84l7l\" (UniqueName: \"kubernetes.io/projected/171032ce-a5a4-4f30-bdc1-8c39e19efe99-kube-api-access-84l7l\") pod \"openshift-config-operator-7777fb866f-tdjrr\" (UID: \"171032ce-a5a4-4f30-bdc1-8c39e19efe99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822029 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-oauth-serving-cert\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822050 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8tqq\" (UniqueName: \"kubernetes.io/projected/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-kube-api-access-n8tqq\") pod \"route-controller-manager-6576b87f9c-sm42l\" (UID: \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822058 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a78dac-6d58-4647-83fb-b0f36f2f660a-config\") pod \"machine-api-operator-5694c8668f-wd7mj\" (UID: \"f3a78dac-6d58-4647-83fb-b0f36f2f660a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822069 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtm4d\" (UniqueName: \"kubernetes.io/projected/bcb87980-5888-4a30-859f-a9ac5b95f2c0-kube-api-access-mtm4d\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822096 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-config\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822143 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gj7qw\" (UID: \"4efd784b-b02d-4298-a96b-ed5663641afa\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822218 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76eee4ae-1408-4cd8-819a-a5ef5c887b9a-service-ca-bundle\") pod \"authentication-operator-69f744f599-8sdmc\" (UID: \"76eee4ae-1408-4cd8-819a-a5ef5c887b9a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822251 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqrm7\" (UniqueName: \"kubernetes.io/projected/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-kube-api-access-tqrm7\") pod \"controller-manager-879f6c89f-sd2nj\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822277 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gj7qw\" (UID: \"4efd784b-b02d-4298-a96b-ed5663641afa\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822308 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjwjr\" (UniqueName: \"kubernetes.io/projected/4efd784b-b02d-4298-a96b-ed5663641afa-kube-api-access-mjwjr\") pod \"marketplace-operator-79b997595-gj7qw\" (UID: \"4efd784b-b02d-4298-a96b-ed5663641afa\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822357 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d8683a6-e42e-4541-81b3-2b28fc5e7be6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xrzpw\" (UID: \"7d8683a6-e42e-4541-81b3-2b28fc5e7be6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822395 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ftnz\" (UniqueName: \"kubernetes.io/projected/0c70928e-51e2-4c62-8ce6-c1f8ba489f8f-kube-api-access-2ftnz\") pod \"cluster-samples-operator-665b6dd947-b6htg\" (UID: \"0c70928e-51e2-4c62-8ce6-c1f8ba489f8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b6htg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822444 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822468 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d8683a6-e42e-4541-81b3-2b28fc5e7be6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xrzpw\" (UID: \"7d8683a6-e42e-4541-81b3-2b28fc5e7be6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822490 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-serving-cert\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822506 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-config\") pod \"controller-manager-879f6c89f-sd2nj\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822521 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822524 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-audit-policies\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822574 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/171032ce-a5a4-4f30-bdc1-8c39e19efe99-serving-cert\") pod \"openshift-config-operator-7777fb866f-tdjrr\" (UID: \"171032ce-a5a4-4f30-bdc1-8c39e19efe99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822595 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-serving-cert\") pod \"route-controller-manager-6576b87f9c-sm42l\" (UID: \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822639 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxpft\" (UniqueName: \"kubernetes.io/projected/52515c5a-0f7d-42d4-90f8-97e66050f161-kube-api-access-bxpft\") pod \"machine-config-controller-84d6567774-9z9sg\" (UID: \"52515c5a-0f7d-42d4-90f8-97e66050f161\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822663 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-default-certificate\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822694 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822713 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-etcd-client\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822732 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-serving-cert\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822752 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-audit-dir\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822773 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fjs9m\" (UID: \"989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.823231 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-audit-policies\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.823809 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-v4cvf"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.828442 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc 
kubenswrapper[4664]: I1003 07:50:38.828980 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-audit\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.829094 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.829594 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.830325 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76eee4ae-1408-4cd8-819a-a5ef5c887b9a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8sdmc\" (UID: \"76eee4ae-1408-4cd8-819a-a5ef5c887b9a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.831424 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f3a78dac-6d58-4647-83fb-b0f36f2f660a-images\") pod \"machine-api-operator-5694c8668f-wd7mj\" (UID: \"f3a78dac-6d58-4647-83fb-b0f36f2f660a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.832050 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76eee4ae-1408-4cd8-819a-a5ef5c887b9a-service-ca-bundle\") pod \"authentication-operator-69f744f599-8sdmc\" (UID: \"76eee4ae-1408-4cd8-819a-a5ef5c887b9a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.822791 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ceef7ba8-f996-4b56-a477-23873e39cde7-console-serving-cert\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837089 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f7be8a-c6e0-47cd-b9bf-d42447f23980-serving-cert\") pod \"console-operator-58897d9998-jtp75\" (UID: \"33f7be8a-c6e0-47cd-b9bf-d42447f23980\") " pod="openshift-console-operator/console-operator-58897d9998-jtp75" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837135 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-trusted-ca-bundle\") pod 
\"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837142 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c47b8dca-2e97-4aba-b303-00c0f2b36ecc-serving-cert\") pod \"service-ca-operator-777779d784-tm59k\" (UID: \"c47b8dca-2e97-4aba-b303-00c0f2b36ecc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tm59k" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837310 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6npk\" (UniqueName: \"kubernetes.io/projected/cf947105-e97d-4a1c-9b59-bf6b37461c1e-kube-api-access-w6npk\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837356 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkqr8\" (UniqueName: \"kubernetes.io/projected/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-kube-api-access-gkqr8\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837415 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f7be8a-c6e0-47cd-b9bf-d42447f23980-config\") pod \"console-operator-58897d9998-jtp75\" (UID: \"33f7be8a-c6e0-47cd-b9bf-d42447f23980\") " pod="openshift-console-operator/console-operator-58897d9998-jtp75" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837455 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcfn9\" (UniqueName: \"kubernetes.io/projected/acf5523b-3f1d-495e-8014-0313925e8727-kube-api-access-kcfn9\") pod \"downloads-7954f5f757-zcngp\" (UID: \"acf5523b-3f1d-495e-8014-0313925e8727\") " pod="openshift-console/downloads-7954f5f757-zcngp" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837496 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837536 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-node-pullsecrets\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837576 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-client-ca\") pod \"route-controller-manager-6576b87f9c-sm42l\" (UID: \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837590 4664 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ceef7ba8-f996-4b56-a477-23873e39cde7-console-oauth-config\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837746 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-audit-dir\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837792 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-console-config\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837846 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a6aa30cf-1aa9-4c2a-a079-40182a1c51ef-images\") pod \"machine-config-operator-74547568cd-nzfrx\" (UID: \"a6aa30cf-1aa9-4c2a-a079-40182a1c51ef\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837881 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhvkt\" (UniqueName: \"kubernetes.io/projected/d2108a93-65c6-4626-9fb6-f93854855b80-kube-api-access-lhvkt\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837926 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3a78dac-6d58-4647-83fb-b0f36f2f660a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wd7mj\" (UID: \"f3a78dac-6d58-4647-83fb-b0f36f2f660a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.837960 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-audit-policies\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.838001 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ef3e87db-9d6e-400e-bd6e-bd578baabbf6-machine-approver-tls\") pod \"machine-approver-56656f9798-6t5kh\" (UID: \"ef3e87db-9d6e-400e-bd6e-bd578baabbf6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.838034 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-config\") pod \"route-controller-manager-6576b87f9c-sm42l\" (UID: 
\"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.838066 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-config-volume\") pod \"collect-profiles-29324625-nj78m\" (UID: \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.838102 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d2108a93-65c6-4626-9fb6-f93854855b80-etcd-ca\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.838149 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33f7be8a-c6e0-47cd-b9bf-d42447f23980-trusted-ca\") pod \"console-operator-58897d9998-jtp75\" (UID: \"33f7be8a-c6e0-47cd-b9bf-d42447f23980\") " pod="openshift-console-operator/console-operator-58897d9998-jtp75" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.839164 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-client-ca\") pod \"controller-manager-879f6c89f-sd2nj\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.843106 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.844503 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.844942 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-etcd-serving-ca\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.846681 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ceef7ba8-f996-4b56-a477-23873e39cde7-console-serving-cert\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.849252 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f7be8a-c6e0-47cd-b9bf-d42447f23980-config\") pod \"console-operator-58897d9998-jtp75\" (UID: \"33f7be8a-c6e0-47cd-b9bf-d42447f23980\") " 
pod="openshift-console-operator/console-operator-58897d9998-jtp75" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.849979 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.850049 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-node-pullsecrets\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.850540 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-audit-dir\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.850708 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33f7be8a-c6e0-47cd-b9bf-d42447f23980-trusted-ca\") pod \"console-operator-58897d9998-jtp75\" (UID: \"33f7be8a-c6e0-47cd-b9bf-d42447f23980\") " pod="openshift-console-operator/console-operator-58897d9998-jtp75" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.851059 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-audit-dir\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.851882 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fjs9m\" (UID: \"989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.852678 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-console-config\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.854102 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-service-ca\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.855497 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sd2nj\" (UID: 
\"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.855637 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-etcd-client\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.856027 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.856106 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-serving-cert\") pod \"controller-manager-879f6c89f-sd2nj\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.856539 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f7be8a-c6e0-47cd-b9bf-d42447f23980-serving-cert\") pod \"console-operator-58897d9998-jtp75\" (UID: \"33f7be8a-c6e0-47cd-b9bf-d42447f23980\") " pod="openshift-console-operator/console-operator-58897d9998-jtp75" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.857073 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.857201 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.859129 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.859747 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.863343 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76eee4ae-1408-4cd8-819a-a5ef5c887b9a-serving-cert\") pod \"authentication-operator-69f744f599-8sdmc\" 
(UID: \"76eee4ae-1408-4cd8-819a-a5ef5c887b9a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.863380 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d8683a6-e42e-4541-81b3-2b28fc5e7be6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xrzpw\" (UID: \"7d8683a6-e42e-4541-81b3-2b28fc5e7be6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.863348 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d8683a6-e42e-4541-81b3-2b28fc5e7be6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xrzpw\" (UID: \"7d8683a6-e42e-4541-81b3-2b28fc5e7be6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.864240 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/171032ce-a5a4-4f30-bdc1-8c39e19efe99-serving-cert\") pod \"openshift-config-operator-7777fb866f-tdjrr\" (UID: \"171032ce-a5a4-4f30-bdc1-8c39e19efe99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.865340 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-trusted-ca-bundle\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.867179 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fjs9m\" (UID: \"989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.867462 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c70928e-51e2-4c62-8ce6-c1f8ba489f8f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-b6htg\" (UID: \"0c70928e-51e2-4c62-8ce6-c1f8ba489f8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b6htg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.867533 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-encryption-config\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.867570 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc 
kubenswrapper[4664]: I1003 07:50:38.867620 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-serving-cert\") pod \"route-controller-manager-6576b87f9c-sm42l\" (UID: \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.869096 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-oauth-serving-cert\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.869340 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/171032ce-a5a4-4f30-bdc1-8c39e19efe99-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tdjrr\" (UID: \"171032ce-a5a4-4f30-bdc1-8c39e19efe99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.876356 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-etcd-client\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.876871 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-encryption-config\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.881048 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ceef7ba8-f996-4b56-a477-23873e39cde7-console-oauth-config\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.881156 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.881625 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8sdmc"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.881674 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zcngp"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.881688 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h7lkc"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.881699 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-etcd-operator/etcd-operator-b45778765-t9rqg"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.881711 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.881806 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.882162 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-serving-cert\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.882300 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ef3e87db-9d6e-400e-bd6e-bd578baabbf6-auth-proxy-config\") pod \"machine-approver-56656f9798-6t5kh\" (UID: \"ef3e87db-9d6e-400e-bd6e-bd578baabbf6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.869519 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-image-import-ca\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.882434 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef3e87db-9d6e-400e-bd6e-bd578baabbf6-config\") pod \"machine-approver-56656f9798-6t5kh\" (UID: \"ef3e87db-9d6e-400e-bd6e-bd578baabbf6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.882671 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76eee4ae-1408-4cd8-819a-a5ef5c887b9a-config\") pod \"authentication-operator-69f744f599-8sdmc\" (UID: \"76eee4ae-1408-4cd8-819a-a5ef5c887b9a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.883002 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-config\") pod \"route-controller-manager-6576b87f9c-sm42l\" (UID: \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.883104 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-szr58"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.883559 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-config\") pod \"controller-manager-879f6c89f-sd2nj\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.883651 4664 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-audit-policies\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.884009 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-serving-cert\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.884207 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ef3e87db-9d6e-400e-bd6e-bd578baabbf6-machine-approver-tls\") pod \"machine-approver-56656f9798-6t5kh\" (UID: \"ef3e87db-9d6e-400e-bd6e-bd578baabbf6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.885250 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.885807 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.886269 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.886522 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3a78dac-6d58-4647-83fb-b0f36f2f660a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wd7mj\" (UID: \"f3a78dac-6d58-4647-83fb-b0f36f2f660a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.887203 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sd2nj"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.888348 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.889420 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.890549 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.891571 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.892804 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.894103 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.895775 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h46lz"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.895834 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.898692 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-942tg"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.899909 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-942tg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.900189 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-gvjlm"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.901230 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-gvjlm" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.901745 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.904930 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rlrp9"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.906636 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gj7qw"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.908348 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zsdch"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.909668 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vsmt5"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.910854 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.912414 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tm59k"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.913914 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.915478 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z8tqh"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.916010 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.917128 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t"] Oct 03 07:50:38 
crc kubenswrapper[4664]: I1003 07:50:38.918537 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.920079 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-v4cvf"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.921852 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.923329 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.924950 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-942tg"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.926338 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bs9r7"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.927092 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bs9r7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.927740 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bs9r7"] Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.939959 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqcms\" (UniqueName: \"kubernetes.io/projected/f3284298-3f20-43d6-95ae-7d40c56534d3-kube-api-access-cqcms\") pod \"package-server-manager-789f6589d5-dg8s7\" (UID: \"f3284298-3f20-43d6-95ae-7d40c56534d3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.939996 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52515c5a-0f7d-42d4-90f8-97e66050f161-proxy-tls\") pod \"machine-config-controller-84d6567774-9z9sg\" (UID: \"52515c5a-0f7d-42d4-90f8-97e66050f161\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940017 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-secret-volume\") pod \"collect-profiles-29324625-nj78m\" (UID: \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940035 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d2108a93-65c6-4626-9fb6-f93854855b80-etcd-service-ca\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940056 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-metrics-certs\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:38 crc 
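
Interleaved with the mounts, kubelet.go:2421 and kubelet.go:2428 report the sync loop receiving ADD and UPDATE events from the API server, and util.go:30 notes when a pod (e.g. openshift-dns/dns-default-942tg, seen just above) has no sandbox yet and needs one started. The snippet below is an illustrative, much-simplified level-triggered loop of the same shape; it is not kubelet's implementation.

package main

import "fmt"

// event mirrors the "SyncLoop ADD"/"SyncLoop UPDATE" records above
// (kubelet.go:2421 and kubelet.go:2428); illustration only.
type event struct {
	op  string // "ADD" or "UPDATE"
	pod string // namespace/name
}

func main() {
	events := make(chan event, 4)
	go func() {
		// The ordering seen above: ADD, then sandbox creation, then UPDATE.
		events <- event{"ADD", "openshift-dns/dns-default-942tg"}
		events <- event{"UPDATE", "openshift-dns/dns-default-942tg"}
		close(events)
	}()

	sandboxes := map[string]bool{} // actual state: does a sandbox exist yet?
	for e := range events {
		fmt.Printf("SyncLoop %s pod=%q\n", e.op, e.pod)
		if !sandboxes[e.pod] {
			// cf. util.go:30: "No sandbox for pod can be found. Need to start a new one"
			fmt.Printf("no sandbox for %q, starting a new one\n", e.pod)
			sandboxes[e.pod] = true
		}
	}
}
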
kubenswrapper[4664]: I1003 07:50:38.940081 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a6aa30cf-1aa9-4c2a-a079-40182a1c51ef-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nzfrx\" (UID: \"a6aa30cf-1aa9-4c2a-a079-40182a1c51ef\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940104 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ebb51e4-a635-4c8a-b287-79c0e7d74a9c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-42xd6\" (UID: \"8ebb51e4-a635-4c8a-b287-79c0e7d74a9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940215 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47b8dca-2e97-4aba-b303-00c0f2b36ecc-config\") pod \"service-ca-operator-777779d784-tm59k\" (UID: \"c47b8dca-2e97-4aba-b303-00c0f2b36ecc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tm59k" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940279 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3284298-3f20-43d6-95ae-7d40c56534d3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dg8s7\" (UID: \"f3284298-3f20-43d6-95ae-7d40c56534d3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940339 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtm4d\" (UniqueName: \"kubernetes.io/projected/bcb87980-5888-4a30-859f-a9ac5b95f2c0-kube-api-access-mtm4d\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940460 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gj7qw\" (UID: \"4efd784b-b02d-4298-a96b-ed5663641afa\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940485 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gj7qw\" (UID: \"4efd784b-b02d-4298-a96b-ed5663641afa\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940574 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjwjr\" (UniqueName: \"kubernetes.io/projected/4efd784b-b02d-4298-a96b-ed5663641afa-kube-api-access-mjwjr\") pod \"marketplace-operator-79b997595-gj7qw\" (UID: \"4efd784b-b02d-4298-a96b-ed5663641afa\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940720 4664 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxpft\" (UniqueName: \"kubernetes.io/projected/52515c5a-0f7d-42d4-90f8-97e66050f161-kube-api-access-bxpft\") pod \"machine-config-controller-84d6567774-9z9sg\" (UID: \"52515c5a-0f7d-42d4-90f8-97e66050f161\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940742 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-default-certificate\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940826 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c47b8dca-2e97-4aba-b303-00c0f2b36ecc-serving-cert\") pod \"service-ca-operator-777779d784-tm59k\" (UID: \"c47b8dca-2e97-4aba-b303-00c0f2b36ecc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tm59k" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940902 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a6aa30cf-1aa9-4c2a-a079-40182a1c51ef-images\") pod \"machine-config-operator-74547568cd-nzfrx\" (UID: \"a6aa30cf-1aa9-4c2a-a079-40182a1c51ef\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940919 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhvkt\" (UniqueName: \"kubernetes.io/projected/d2108a93-65c6-4626-9fb6-f93854855b80-kube-api-access-lhvkt\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940937 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d2108a93-65c6-4626-9fb6-f93854855b80-etcd-ca\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.941044 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-config-volume\") pod \"collect-profiles-29324625-nj78m\" (UID: \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.940945 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ebb51e4-a635-4c8a-b287-79c0e7d74a9c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-42xd6\" (UID: \"8ebb51e4-a635-4c8a-b287-79c0e7d74a9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.941117 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8ebb51e4-a635-4c8a-b287-79c0e7d74a9c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-42xd6\" (UID: \"8ebb51e4-a635-4c8a-b287-79c0e7d74a9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.941153 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a6aa30cf-1aa9-4c2a-a079-40182a1c51ef-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nzfrx\" (UID: \"a6aa30cf-1aa9-4c2a-a079-40182a1c51ef\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.941178 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d2108a93-65c6-4626-9fb6-f93854855b80-etcd-client\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.941197 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtn69\" (UniqueName: \"kubernetes.io/projected/c47b8dca-2e97-4aba-b303-00c0f2b36ecc-kube-api-access-gtn69\") pod \"service-ca-operator-777779d784-tm59k\" (UID: \"c47b8dca-2e97-4aba-b303-00c0f2b36ecc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tm59k" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.941213 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2108a93-65c6-4626-9fb6-f93854855b80-config\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.941230 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/52515c5a-0f7d-42d4-90f8-97e66050f161-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9z9sg\" (UID: \"52515c5a-0f7d-42d4-90f8-97e66050f161\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.941246 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-stats-auth\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.942073 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/52515c5a-0f7d-42d4-90f8-97e66050f161-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9z9sg\" (UID: \"52515c5a-0f7d-42d4-90f8-97e66050f161\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.941263 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ab24ce7c-1a43-44de-98c3-e9bbdb0c7b6c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vsmt5\" (UID: 
\"ab24ce7c-1a43-44de-98c3-e9bbdb0c7b6c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vsmt5" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.942133 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6b56\" (UniqueName: \"kubernetes.io/projected/ab24ce7c-1a43-44de-98c3-e9bbdb0c7b6c-kube-api-access-s6b56\") pod \"multus-admission-controller-857f4d67dd-vsmt5\" (UID: \"ab24ce7c-1a43-44de-98c3-e9bbdb0c7b6c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vsmt5" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.942839 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6aa30cf-1aa9-4c2a-a079-40182a1c51ef-proxy-tls\") pod \"machine-config-operator-74547568cd-nzfrx\" (UID: \"a6aa30cf-1aa9-4c2a-a079-40182a1c51ef\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.943001 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2108a93-65c6-4626-9fb6-f93854855b80-serving-cert\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.943150 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq8pc\" (UniqueName: \"kubernetes.io/projected/8ebb51e4-a635-4c8a-b287-79c0e7d74a9c-kube-api-access-xq8pc\") pod \"openshift-controller-manager-operator-756b6f6bc6-42xd6\" (UID: \"8ebb51e4-a635-4c8a-b287-79c0e7d74a9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.943196 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24bdq\" (UniqueName: \"kubernetes.io/projected/a6aa30cf-1aa9-4c2a-a079-40182a1c51ef-kube-api-access-24bdq\") pod \"machine-config-operator-74547568cd-nzfrx\" (UID: \"a6aa30cf-1aa9-4c2a-a079-40182a1c51ef\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.943225 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cchd\" (UniqueName: \"kubernetes.io/projected/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-kube-api-access-9cchd\") pod \"collect-profiles-29324625-nj78m\" (UID: \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.943246 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcb87980-5888-4a30-859f-a9ac5b95f2c0-service-ca-bundle\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.944154 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ebb51e4-a635-4c8a-b287-79c0e7d74a9c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-42xd6\" (UID: \"8ebb51e4-a635-4c8a-b287-79c0e7d74a9c\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.945510 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.956926 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.976107 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 03 07:50:38 crc kubenswrapper[4664]: I1003 07:50:38.996402 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.015891 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.022589 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2108a93-65c6-4626-9fb6-f93854855b80-config\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.036141 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.056063 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.066528 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2108a93-65c6-4626-9fb6-f93854855b80-serving-cert\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.075575 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.084729 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d2108a93-65c6-4626-9fb6-f93854855b80-etcd-client\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.097941 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.103549 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d2108a93-65c6-4626-9fb6-f93854855b80-etcd-ca\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.117963 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.121091 4664 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d2108a93-65c6-4626-9fb6-f93854855b80-etcd-service-ca\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.135457 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.155880 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.175896 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.196498 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.216017 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.236072 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.255899 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.275391 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.295718 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.316256 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.336656 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.356979 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.376289 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.396224 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.416019 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.436297 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 03 07:50:39 crc 
kubenswrapper[4664]: I1003 07:50:39.442015 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47b8dca-2e97-4aba-b303-00c0f2b36ecc-config\") pod \"service-ca-operator-777779d784-tm59k\" (UID: \"c47b8dca-2e97-4aba-b303-00c0f2b36ecc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tm59k" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.456239 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.463907 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-secret-volume\") pod \"collect-profiles-29324625-nj78m\" (UID: \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.475308 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.484532 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c47b8dca-2e97-4aba-b303-00c0f2b36ecc-serving-cert\") pod \"service-ca-operator-777779d784-tm59k\" (UID: \"c47b8dca-2e97-4aba-b303-00c0f2b36ecc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tm59k" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.496538 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.515811 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.536598 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.555670 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.576652 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.595902 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.615488 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.657088 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.664453 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ab24ce7c-1a43-44de-98c3-e9bbdb0c7b6c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vsmt5\" (UID: \"ab24ce7c-1a43-44de-98c3-e9bbdb0c7b6c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vsmt5" Oct 03 
07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.676802 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.696487 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.716000 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.735792 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.755866 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.776552 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.797248 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.814657 4664 request.go:700] Waited for 1.00980167s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/configmaps?fieldSelector=metadata.name%3Dmachine-config-operator-images&limit=500&resourceVersion=0 Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.816519 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.822449 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a6aa30cf-1aa9-4c2a-a079-40182a1c51ef-images\") pod \"machine-config-operator-74547568cd-nzfrx\" (UID: \"a6aa30cf-1aa9-4c2a-a079-40182a1c51ef\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.836386 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.847333 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6aa30cf-1aa9-4c2a-a079-40182a1c51ef-proxy-tls\") pod \"machine-config-operator-74547568cd-nzfrx\" (UID: \"a6aa30cf-1aa9-4c2a-a079-40182a1c51ef\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.858127 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.876549 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.895731 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 
07:50:39.916930 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.936302 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.940317 4664 secret.go:188] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.940453 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52515c5a-0f7d-42d4-90f8-97e66050f161-proxy-tls podName:52515c5a-0f7d-42d4-90f8-97e66050f161 nodeName:}" failed. No retries permitted until 2025-10-03 07:50:40.440407916 +0000 UTC m=+141.261598416 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/52515c5a-0f7d-42d4-90f8-97e66050f161-proxy-tls") pod "machine-config-controller-84d6567774-9z9sg" (UID: "52515c5a-0f7d-42d4-90f8-97e66050f161") : failed to sync secret cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.940472 4664 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.940559 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3284298-3f20-43d6-95ae-7d40c56534d3-package-server-manager-serving-cert podName:f3284298-3f20-43d6-95ae-7d40c56534d3 nodeName:}" failed. No retries permitted until 2025-10-03 07:50:40.44053796 +0000 UTC m=+141.261728450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/f3284298-3f20-43d6-95ae-7d40c56534d3-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-dg8s7" (UID: "f3284298-3f20-43d6-95ae-7d40c56534d3") : failed to sync secret cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.940589 4664 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.940641 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-operator-metrics podName:4efd784b-b02d-4298-a96b-ed5663641afa nodeName:}" failed. No retries permitted until 2025-10-03 07:50:40.440633613 +0000 UTC m=+141.261824103 (durationBeforeRetry 500ms). 
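Note on the "No retries permitted until ... (durationBeforeRetry 500ms)" errors above: the kubelet does not fail the pod outright when a volume mount cannot complete; the nested pending operations tracker parks the operation and retries with exponential backoff, starting at the 500ms shown here. The sketch below reproduces that doubling-backoff shape with the apimachinery wait helpers; it illustrates the pattern only and is not the kubelet's actual nestedpendingoperations code.

    package main

    import (
        "errors"
        "fmt"
        "time"

        "k8s.io/apimachinery/pkg/util/wait"
    )

    func main() {
        // Start at 500ms (the "durationBeforeRetry 500ms" above) and
        // double after each failed attempt, up to a cap.
        backoff := wait.Backoff{
            Duration: 500 * time.Millisecond,
            Factor:   2.0,
            Steps:    6,
            Cap:      2 * time.Minute,
        }
        attempt := 0
        err := wait.ExponentialBackoff(backoff, func() (bool, error) {
            attempt++
            fmt.Printf("attempt %d: MountVolume.SetUp\n", attempt)
            // Pretend the secret cache stays unsynced for the first tries.
            return attempt >= 4, nil
        })
        if errors.Is(err, wait.ErrWaitTimeout) {
            fmt.Println("gave up: timed out waiting for the condition")
        }
    }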
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-operator-metrics") pod "marketplace-operator-79b997595-gj7qw" (UID: "4efd784b-b02d-4298-a96b-ed5663641afa") : failed to sync secret cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.940644 4664 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.940731 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-trusted-ca podName:4efd784b-b02d-4298-a96b-ed5663641afa nodeName:}" failed. No retries permitted until 2025-10-03 07:50:40.440708016 +0000 UTC m=+141.261898576 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-trusted-ca") pod "marketplace-operator-79b997595-gj7qw" (UID: "4efd784b-b02d-4298-a96b-ed5663641afa") : failed to sync configmap cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.940830 4664 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.940874 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-metrics-certs podName:bcb87980-5888-4a30-859f-a9ac5b95f2c0 nodeName:}" failed. No retries permitted until 2025-10-03 07:50:40.440860631 +0000 UTC m=+141.262051201 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-metrics-certs") pod "router-default-5444994796-jxft6" (UID: "bcb87980-5888-4a30-859f-a9ac5b95f2c0") : failed to sync secret cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.941198 4664 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.941221 4664 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.941242 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-config-volume podName:5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91 nodeName:}" failed. No retries permitted until 2025-10-03 07:50:40.441229454 +0000 UTC m=+141.262420014 (durationBeforeRetry 500ms). 
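Each "Couldn't get secret ...: failed to sync secret cache: timed out waiting for the condition" entry means a mount was attempted before the kubelet's watch cache for that namespace had finished its initial list; the "Caches populated for *v1.Secret ..." reflector lines interleaved through this log mark each cache coming online, after which the retried mounts succeed. The same list-watch-sync handshake looks like this with a client-go informer (a self-contained sketch; the namespace is taken from the log, the rest is illustrative):

    package main

    import (
        "fmt"
        "time"

        "k8s.io/client-go/informers"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/rest"
        "k8s.io/client-go/tools/cache"
    )

    func main() {
        cfg, err := rest.InClusterConfig()
        if err != nil {
            panic(err)
        }
        clientset := kubernetes.NewForConfigOrDie(cfg)

        // Watch Secrets in one namespace, the way the kubelet scopes its
        // caches per object-"<namespace>"/"<name>" in the log above.
        factory := informers.NewSharedInformerFactoryWithOptions(
            clientset, 10*time.Minute,
            informers.WithNamespace("openshift-ingress"),
        )
        secrets := factory.Core().V1().Secrets().Informer()

        stopCh := make(chan struct{})
        defer close(stopCh)
        factory.Start(stopCh)

        // Reads served before this returns can fail exactly like the
        // "failed to sync secret cache" errors in the log.
        if !cache.WaitForCacheSync(stopCh, secrets.HasSynced) {
            fmt.Println("timed out waiting for the condition")
            return
        }
        fmt.Println("caches populated; secret lookups now hit the cache")
    }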
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-config-volume") pod "collect-profiles-29324625-nj78m" (UID: "5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91") : failed to sync configmap cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.941260 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-default-certificate podName:bcb87980-5888-4a30-859f-a9ac5b95f2c0 nodeName:}" failed. No retries permitted until 2025-10-03 07:50:40.441250564 +0000 UTC m=+141.262441054 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-default-certificate") pod "router-default-5444994796-jxft6" (UID: "bcb87980-5888-4a30-859f-a9ac5b95f2c0") : failed to sync secret cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.942418 4664 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.942477 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-stats-auth podName:bcb87980-5888-4a30-859f-a9ac5b95f2c0 nodeName:}" failed. No retries permitted until 2025-10-03 07:50:40.442467366 +0000 UTC m=+141.263657856 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-stats-auth") pod "router-default-5444994796-jxft6" (UID: "bcb87980-5888-4a30-859f-a9ac5b95f2c0") : failed to sync secret cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.943581 4664 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: E1003 07:50:39.943659 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bcb87980-5888-4a30-859f-a9ac5b95f2c0-service-ca-bundle podName:bcb87980-5888-4a30-859f-a9ac5b95f2c0 nodeName:}" failed. No retries permitted until 2025-10-03 07:50:40.443644136 +0000 UTC m=+141.264834716 (durationBeforeRetry 500ms). 
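These MountVolume.SetUp attempts are driven by the volume manager's reconciler, whose "operationExecutor.VerifyControllerAttachedVolume started" and "operationExecutor.MountVolume started" lines appear further down: it continuously diffs a desired state of the world (volumes the scheduled pods need) against an actual state (volumes currently attached and mounted) and launches one async operation per gap. A deliberately simplified sketch of that diff-and-act loop, illustrating the pattern only, not the kubelet's real data structures (the volume names are placeholders):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Desired: volumes required by pods assigned to this node.
        desired := map[string]bool{
            "proxy-tls (machine-config-controller)": true,
            "config-volume (collect-profiles)":      true,
        }
        // Actual: volumes already mounted on the node.
        actual := map[string]bool{}

        for tick := 0; tick < 3; tick++ {
            for vol := range desired {
                if !actual[vol] {
                    // In the kubelet this is an async operationExecutor
                    // call; failures retry with backoff, as seen above.
                    fmt.Printf("MountVolume started for volume %q\n", vol)
                    actual[vol] = true
                }
            }
            for vol := range actual {
                if !desired[vol] {
                    fmt.Printf("UnmountVolume started for volume %q\n", vol)
                    delete(actual, vol)
                }
            }
            time.Sleep(100 * time.Millisecond)
        }
    }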
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/bcb87980-5888-4a30-859f-a9ac5b95f2c0-service-ca-bundle") pod "router-default-5444994796-jxft6" (UID: "bcb87980-5888-4a30-859f-a9ac5b95f2c0") : failed to sync configmap cache: timed out waiting for the condition Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.955569 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.981889 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 03 07:50:39 crc kubenswrapper[4664]: I1003 07:50:39.996341 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.017036 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.036856 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.056457 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.076834 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.096253 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.116067 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.136367 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.156517 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.176523 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.196672 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.215836 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.235441 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.256488 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.276190 4664 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-certs-default" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.295893 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.315649 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.336257 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.356282 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.396651 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.417236 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.452948 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn4s7\" (UniqueName: \"kubernetes.io/projected/f3a78dac-6d58-4647-83fb-b0f36f2f660a-kube-api-access-hn4s7\") pod \"machine-api-operator-5694c8668f-wd7mj\" (UID: \"f3a78dac-6d58-4647-83fb-b0f36f2f660a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.458262 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.462291 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcb87980-5888-4a30-859f-a9ac5b95f2c0-service-ca-bundle\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.462362 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52515c5a-0f7d-42d4-90f8-97e66050f161-proxy-tls\") pod \"machine-config-controller-84d6567774-9z9sg\" (UID: \"52515c5a-0f7d-42d4-90f8-97e66050f161\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.462393 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-metrics-certs\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.462452 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3284298-3f20-43d6-95ae-7d40c56534d3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dg8s7\" (UID: \"f3284298-3f20-43d6-95ae-7d40c56534d3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 
07:50:40.462502 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gj7qw\" (UID: \"4efd784b-b02d-4298-a96b-ed5663641afa\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.462535 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gj7qw\" (UID: \"4efd784b-b02d-4298-a96b-ed5663641afa\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.462621 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-default-certificate\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.463205 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcb87980-5888-4a30-859f-a9ac5b95f2c0-service-ca-bundle\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.463915 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gj7qw\" (UID: \"4efd784b-b02d-4298-a96b-ed5663641afa\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.464320 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-config-volume\") pod \"collect-profiles-29324625-nj78m\" (UID: \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.464355 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-stats-auth\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.465387 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-config-volume\") pod \"collect-profiles-29324625-nj78m\" (UID: \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.465533 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f3284298-3f20-43d6-95ae-7d40c56534d3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dg8s7\" (UID: \"f3284298-3f20-43d6-95ae-7d40c56534d3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.466067 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gj7qw\" (UID: \"4efd784b-b02d-4298-a96b-ed5663641afa\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.467959 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-default-certificate\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.468213 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-stats-auth\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.468491 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb87980-5888-4a30-859f-a9ac5b95f2c0-metrics-certs\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.471371 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2h2p\" (UniqueName: \"kubernetes.io/projected/4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1-kube-api-access-n2h2p\") pod \"apiserver-76f77b778f-87c8s\" (UID: \"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1\") " pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.472035 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52515c5a-0f7d-42d4-90f8-97e66050f161-proxy-tls\") pod \"machine-config-controller-84d6567774-9z9sg\" (UID: \"52515c5a-0f7d-42d4-90f8-97e66050f161\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.491416 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbwrk\" (UniqueName: \"kubernetes.io/projected/ef3e87db-9d6e-400e-bd6e-bd578baabbf6-kube-api-access-qbwrk\") pod \"machine-approver-56656f9798-6t5kh\" (UID: \"ef3e87db-9d6e-400e-bd6e-bd578baabbf6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.510038 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gngmq\" (UniqueName: \"kubernetes.io/projected/76eee4ae-1408-4cd8-819a-a5ef5c887b9a-kube-api-access-gngmq\") pod \"authentication-operator-69f744f599-8sdmc\" (UID: \"76eee4ae-1408-4cd8-819a-a5ef5c887b9a\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.534543 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs5mq\" (UniqueName: \"kubernetes.io/projected/33f7be8a-c6e0-47cd-b9bf-d42447f23980-kube-api-access-xs5mq\") pod \"console-operator-58897d9998-jtp75\" (UID: \"33f7be8a-c6e0-47cd-b9bf-d42447f23980\") " pod="openshift-console-operator/console-operator-58897d9998-jtp75" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.553431 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh8nh\" (UniqueName: \"kubernetes.io/projected/989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5-kube-api-access-sh8nh\") pod \"openshift-apiserver-operator-796bbdcf4f-fjs9m\" (UID: \"989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.562145 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.573880 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqrm7\" (UniqueName: \"kubernetes.io/projected/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-kube-api-access-tqrm7\") pod \"controller-manager-879f6c89f-sd2nj\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.580744 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.592154 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.593903 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zbwr\" (UniqueName: \"kubernetes.io/projected/7d8683a6-e42e-4541-81b3-2b28fc5e7be6-kube-api-access-5zbwr\") pod \"cluster-image-registry-operator-dc59b4c8b-xrzpw\" (UID: \"7d8683a6-e42e-4541-81b3-2b28fc5e7be6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.610594 4664 util.go:30] "No sandbox for pod can be found. 
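"No sandbox for pod can be found. Need to start a new one" is the sync loop discovering that a pod has no live sandbox in the container runtime, expected here since every pod is starting fresh after the node booted, so the kubelet asks CRI-O to create one. What the kubelet sees can be reproduced by querying the same CRI endpoint; a sketch assuming CRI-O's conventional socket path /var/run/crio/crio.sock, with the pod name taken from the log:

    package main

    import (
        "context"
        "fmt"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        conn, err := grpc.Dial("unix:///var/run/crio/crio.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            panic(err)
        }
        defer conn.Close()

        client := runtimeapi.NewRuntimeServiceClient(conn)
        resp, err := client.ListPodSandbox(context.Background(),
            &runtimeapi.ListPodSandboxRequest{
                Filter: &runtimeapi.PodSandboxFilter{
                    LabelSelector: map[string]string{
                        "io.kubernetes.pod.name": "controller-manager-879f6c89f-sd2nj",
                    },
                },
            })
        if err != nil {
            panic(err)
        }
        // An empty list here corresponds to the kubelet's
        // "No sandbox for pod can be found" message.
        fmt.Printf("sandboxes found: %d\n", len(resp.Items))
    }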
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jtp75" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.613726 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6npk\" (UniqueName: \"kubernetes.io/projected/cf947105-e97d-4a1c-9b59-bf6b37461c1e-kube-api-access-w6npk\") pod \"oauth-openshift-558db77b4-h46lz\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") " pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.632312 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkqr8\" (UniqueName: \"kubernetes.io/projected/2a7ecab1-cd44-4dee-810a-1fd601b96eb4-kube-api-access-gkqr8\") pod \"apiserver-7bbb656c7d-smvlm\" (UID: \"2a7ecab1-cd44-4dee-810a-1fd601b96eb4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.651789 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcfn9\" (UniqueName: \"kubernetes.io/projected/acf5523b-3f1d-495e-8014-0313925e8727-kube-api-access-kcfn9\") pod \"downloads-7954f5f757-zcngp\" (UID: \"acf5523b-3f1d-495e-8014-0313925e8727\") " pod="openshift-console/downloads-7954f5f757-zcngp" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.675346 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wd7mj"] Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.677545 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bwfx\" (UniqueName: \"kubernetes.io/projected/ceef7ba8-f996-4b56-a477-23873e39cde7-kube-api-access-2bwfx\") pod \"console-f9d7485db-2m6x7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.691468 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d8683a6-e42e-4541-81b3-2b28fc5e7be6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xrzpw\" (UID: \"7d8683a6-e42e-4541-81b3-2b28fc5e7be6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" Oct 03 07:50:40 crc kubenswrapper[4664]: W1003 07:50:40.693704 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3a78dac_6d58_4647_83fb_b0f36f2f660a.slice/crio-5bdb6668dcf673c115e07695b7b3f5363174dccfe78b4ed087b77a98626e5e83 WatchSource:0}: Error finding container 5bdb6668dcf673c115e07695b7b3f5363174dccfe78b4ed087b77a98626e5e83: Status 404 returned error can't find the container with id 5bdb6668dcf673c115e07695b7b3f5363174dccfe78b4ed087b77a98626e5e83 Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.699964 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.706974 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.712322 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ftnz\" (UniqueName: \"kubernetes.io/projected/0c70928e-51e2-4c62-8ce6-c1f8ba489f8f-kube-api-access-2ftnz\") pod \"cluster-samples-operator-665b6dd947-b6htg\" (UID: \"0c70928e-51e2-4c62-8ce6-c1f8ba489f8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b6htg" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.714385 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b6htg" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.723309 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.730919 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84l7l\" (UniqueName: \"kubernetes.io/projected/171032ce-a5a4-4f30-bdc1-8c39e19efe99-kube-api-access-84l7l\") pod \"openshift-config-operator-7777fb866f-tdjrr\" (UID: \"171032ce-a5a4-4f30-bdc1-8c39e19efe99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.739088 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.759829 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.760933 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8tqq\" (UniqueName: \"kubernetes.io/projected/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-kube-api-access-n8tqq\") pod \"route-controller-manager-6576b87f9c-sm42l\" (UID: \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.776791 4664 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.796847 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.806909 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.814653 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8sdmc"] Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.814911 4664 request.go:700] Waited for 1.914803037s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0 Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.820125 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.836933 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.838390 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.841598 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sd2nj"] Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.857017 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.863565 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 07:50:40 crc kubenswrapper[4664]: W1003 07:50:40.875096 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b63fb51_bfa4_4c92_a1b0_9044cf7cff03.slice/crio-a6e177761557151257501904255ec8e94c8e46f85fa67ec67cb557b0406ca568 WatchSource:0}: Error finding container a6e177761557151257501904255ec8e94c8e46f85fa67ec67cb557b0406ca568: Status 404 returned error can't find the container with id a6e177761557151257501904255ec8e94c8e46f85fa67ec67cb557b0406ca568 Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.875549 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.881017 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jtp75"] Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.896273 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.904729 4664 util.go:30] "No sandbox for pod can be found. 
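The "Waited for 1.914803037s due to client-side throttling, not priority and fairness" entries (an earlier one shows 1.00980167s) are not the API server pushing back: client-go's token-bucket rate limiter is delaying the kubelet's own GETs during this startup burst, and request.go logs any wait that exceeds roughly a second. The limiter is configured on rest.Config; a minimal sketch of that knob (the QPS and burst values are illustrative, not this kubelet's settings):

    package main

    import (
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/rest"
        "k8s.io/client-go/util/flowcontrol"
    )

    func main() {
        cfg, err := rest.InClusterConfig()
        if err != nil {
            panic(err)
        }

        // Token bucket: sustained queries per second plus a burst
        // allowance. Requests beyond the bucket wait, producing the
        // "Waited for ... due to client-side throttling" log line
        // when the wait runs long.
        cfg.QPS = 50
        cfg.Burst = 100
        // Equivalent explicit form:
        cfg.RateLimiter = flowcontrol.NewTokenBucketRateLimiter(cfg.QPS, cfg.Burst)

        _ = kubernetes.NewForConfigOrDie(cfg)
    }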
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.916993 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.936046 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.943510 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-zcngp" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.957142 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 03 07:50:40 crc kubenswrapper[4664]: I1003 07:50:40.990242 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:40.999960 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.044713 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqcms\" (UniqueName: \"kubernetes.io/projected/f3284298-3f20-43d6-95ae-7d40c56534d3-kube-api-access-cqcms\") pod \"package-server-manager-789f6589d5-dg8s7\" (UID: \"f3284298-3f20-43d6-95ae-7d40c56534d3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.075326 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtm4d\" (UniqueName: \"kubernetes.io/projected/bcb87980-5888-4a30-859f-a9ac5b95f2c0-kube-api-access-mtm4d\") pod \"router-default-5444994796-jxft6\" (UID: \"bcb87980-5888-4a30-859f-a9ac5b95f2c0\") " pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.084670 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjwjr\" (UniqueName: \"kubernetes.io/projected/4efd784b-b02d-4298-a96b-ed5663641afa-kube-api-access-mjwjr\") pod \"marketplace-operator-79b997595-gj7qw\" (UID: \"4efd784b-b02d-4298-a96b-ed5663641afa\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.094733 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h46lz"] Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.099965 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxpft\" (UniqueName: \"kubernetes.io/projected/52515c5a-0f7d-42d4-90f8-97e66050f161-kube-api-access-bxpft\") pod \"machine-config-controller-84d6567774-9z9sg\" (UID: \"52515c5a-0f7d-42d4-90f8-97e66050f161\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.111326 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhvkt\" (UniqueName: \"kubernetes.io/projected/d2108a93-65c6-4626-9fb6-f93854855b80-kube-api-access-lhvkt\") pod \"etcd-operator-b45778765-t9rqg\" (UID: \"d2108a93-65c6-4626-9fb6-f93854855b80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:41 crc 
kubenswrapper[4664]: I1003 07:50:41.141296 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.151684 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.157731 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6b56\" (UniqueName: \"kubernetes.io/projected/ab24ce7c-1a43-44de-98c3-e9bbdb0c7b6c-kube-api-access-s6b56\") pod \"multus-admission-controller-857f4d67dd-vsmt5\" (UID: \"ab24ce7c-1a43-44de-98c3-e9bbdb0c7b6c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vsmt5" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.174946 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtn69\" (UniqueName: \"kubernetes.io/projected/c47b8dca-2e97-4aba-b303-00c0f2b36ecc-kube-api-access-gtn69\") pod \"service-ca-operator-777779d784-tm59k\" (UID: \"c47b8dca-2e97-4aba-b303-00c0f2b36ecc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tm59k" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.180099 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.190244 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.196233 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq8pc\" (UniqueName: \"kubernetes.io/projected/8ebb51e4-a635-4c8a-b287-79c0e7d74a9c-kube-api-access-xq8pc\") pod \"openshift-controller-manager-operator-756b6f6bc6-42xd6\" (UID: \"8ebb51e4-a635-4c8a-b287-79c0e7d74a9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.201975 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24bdq\" (UniqueName: \"kubernetes.io/projected/a6aa30cf-1aa9-4c2a-a079-40182a1c51ef-kube-api-access-24bdq\") pod \"machine-config-operator-74547568cd-nzfrx\" (UID: \"a6aa30cf-1aa9-4c2a-a079-40182a1c51ef\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.225327 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cchd\" (UniqueName: \"kubernetes.io/projected/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-kube-api-access-9cchd\") pod \"collect-profiles-29324625-nj78m\" (UID: \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" Oct 03 07:50:41 crc kubenswrapper[4664]: W1003 07:50:41.273046 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcb87980_5888_4a30_859f_a9ac5b95f2c0.slice/crio-c39b1588963bb8308efb9a907fdee13768846358f693650d5e0d03be8300c1a9 WatchSource:0}: Error finding container c39b1588963bb8308efb9a907fdee13768846358f693650d5e0d03be8300c1a9: Status 404 returned error can't find the container with id 
c39b1588963bb8308efb9a907fdee13768846358f693650d5e0d03be8300c1a9 Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.366073 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.367331 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.367977 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f159324-ddf2-49a4-b432-46e5efd09e16-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hpqxb\" (UID: \"2f159324-ddf2-49a4-b432-46e5efd09e16\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.368086 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/478f27ac-050e-4086-82c9-2e23559cf70b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.368134 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-registry-tls\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.368806 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-bound-sa-token\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.369477 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/203737df-98aa-47e4-856a-88670695ce9c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-c5ncx\" (UID: \"203737df-98aa-47e4-856a-88670695ce9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.370047 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f159324-ddf2-49a4-b432-46e5efd09e16-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hpqxb\" (UID: \"2f159324-ddf2-49a4-b432-46e5efd09e16\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.370137 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx8jk\" (UniqueName: \"kubernetes.io/projected/b56b2f61-0758-42d4-a4c8-77c9848991ab-kube-api-access-nx8jk\") pod \"migrator-59844c95c7-zsdch\" (UID: 
\"b56b2f61-0758-42d4-a4c8-77c9848991ab\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zsdch" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.370284 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsnrt\" (UniqueName: \"kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-kube-api-access-fsnrt\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.370334 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/478f27ac-050e-4086-82c9-2e23559cf70b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.370478 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/478f27ac-050e-4086-82c9-2e23559cf70b-trusted-ca\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.370509 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2209b7cc-875e-4b43-94ac-555eefedcc58-profile-collector-cert\") pod \"catalog-operator-68c6474976-s7kfk\" (UID: \"2209b7cc-875e-4b43-94ac-555eefedcc58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.370530 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdcwr\" (UniqueName: \"kubernetes.io/projected/2209b7cc-875e-4b43-94ac-555eefedcc58-kube-api-access-jdcwr\") pod \"catalog-operator-68c6474976-s7kfk\" (UID: \"2209b7cc-875e-4b43-94ac-555eefedcc58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.370566 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.370648 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76lbb\" (UniqueName: \"kubernetes.io/projected/203737df-98aa-47e4-856a-88670695ce9c-kube-api-access-76lbb\") pod \"kube-storage-version-migrator-operator-b67b599dd-c5ncx\" (UID: \"203737df-98aa-47e4-856a-88670695ce9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.370736 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/478f27ac-050e-4086-82c9-2e23559cf70b-registry-certificates\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.370763 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f159324-ddf2-49a4-b432-46e5efd09e16-config\") pod \"kube-apiserver-operator-766d6c64bb-hpqxb\" (UID: \"2f159324-ddf2-49a4-b432-46e5efd09e16\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.370785 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/203737df-98aa-47e4-856a-88670695ce9c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-c5ncx\" (UID: \"203737df-98aa-47e4-856a-88670695ce9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.370858 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5gc5\" (UniqueName: \"kubernetes.io/projected/5c7c6d84-6feb-47fe-8f80-0cd9ca2e7686-kube-api-access-c5gc5\") pod \"dns-operator-744455d44c-h7lkc\" (UID: \"5c7c6d84-6feb-47fe-8f80-0cd9ca2e7686\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7lkc" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.370908 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c7c6d84-6feb-47fe-8f80-0cd9ca2e7686-metrics-tls\") pod \"dns-operator-744455d44c-h7lkc\" (UID: \"5c7c6d84-6feb-47fe-8f80-0cd9ca2e7686\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7lkc" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.371137 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2209b7cc-875e-4b43-94ac-555eefedcc58-srv-cert\") pod \"catalog-operator-68c6474976-s7kfk\" (UID: \"2209b7cc-875e-4b43-94ac-555eefedcc58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk" Oct 03 07:50:41 crc kubenswrapper[4664]: E1003 07:50:41.372242 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:41.872226116 +0000 UTC m=+142.693416806 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.392480 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tm59k" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.408429 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vsmt5" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.419581 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.440874 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2m6x7"] Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.467281 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw"] Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.474395 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.474796 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:41 crc kubenswrapper[4664]: E1003 07:50:41.474913 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:41.974890528 +0000 UTC m=+142.796081018 (durationBeforeRetry 500ms). 
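Both CSI failures above, MountVolume.MountDevice for the image-registry PVC and UnmountVolume.TearDown for the old pod, report "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers". That is ordering rather than misconfiguration: the driver runs as a pod itself (csi-hostpathplugin-v4cvf, whose volumes are being set up just below) and has not yet registered over the kubelet's plugin-registration socket, so both operations land on the same 500ms retry backoff. Once registration completes, the driver appears in the node's CSINode object, which can be checked like this (node name "crc" taken from this log; otherwise illustrative):

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/rest"
    )

    func main() {
        cfg, err := rest.InClusterConfig()
        if err != nil {
            panic(err)
        }
        clientset := kubernetes.NewForConfigOrDie(cfg)

        // CSINode lists every driver that has completed kubelet plugin
        // registration on this node; until kubevirt.io.hostpath-provisioner
        // appears here, MountDevice for its volumes keeps failing.
        csiNode, err := clientset.StorageV1().CSINodes().Get(
            context.Background(), "crc", metav1.GetOptions{})
        if err != nil {
            panic(err)
        }
        for _, d := range csiNode.Spec.Drivers {
            fmt.Printf("registered CSI driver: %s (node ID %s)\n", d.Name, d.NodeID)
        }
    }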
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.474991 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d0c882b-060d-4104-9d29-0a9576607a4a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vcvhp\" (UID: \"5d0c882b-060d-4104-9d29-0a9576607a4a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475029 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c7c6d84-6feb-47fe-8f80-0cd9ca2e7686-metrics-tls\") pod \"dns-operator-744455d44c-h7lkc\" (UID: \"5c7c6d84-6feb-47fe-8f80-0cd9ca2e7686\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7lkc" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475055 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5gc5\" (UniqueName: \"kubernetes.io/projected/5c7c6d84-6feb-47fe-8f80-0cd9ca2e7686-kube-api-access-c5gc5\") pod \"dns-operator-744455d44c-h7lkc\" (UID: \"5c7c6d84-6feb-47fe-8f80-0cd9ca2e7686\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7lkc" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475093 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5dba8f90-eac6-468f-91c7-b95d2d04b2ec-config-volume\") pod \"dns-default-942tg\" (UID: \"5dba8f90-eac6-468f-91c7-b95d2d04b2ec\") " pod="openshift-dns/dns-default-942tg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475150 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2209b7cc-875e-4b43-94ac-555eefedcc58-srv-cert\") pod \"catalog-operator-68c6474976-s7kfk\" (UID: \"2209b7cc-875e-4b43-94ac-555eefedcc58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475170 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/77a9c147-cee1-4a60-a2e1-8dd93096f35f-apiservice-cert\") pod \"packageserver-d55dfcdfc-xt6kz\" (UID: \"77a9c147-cee1-4a60-a2e1-8dd93096f35f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475203 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f159324-ddf2-49a4-b432-46e5efd09e16-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hpqxb\" (UID: \"2f159324-ddf2-49a4-b432-46e5efd09e16\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475284 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/77a9c147-cee1-4a60-a2e1-8dd93096f35f-webhook-cert\") pod \"packageserver-d55dfcdfc-xt6kz\" (UID: \"77a9c147-cee1-4a60-a2e1-8dd93096f35f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475388 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/478f27ac-050e-4086-82c9-2e23559cf70b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475503 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d0c882b-060d-4104-9d29-0a9576607a4a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vcvhp\" (UID: \"5d0c882b-060d-4104-9d29-0a9576607a4a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475548 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-registry-tls\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475578 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/32eb3af2-4f0e-43f7-99ca-60eb556894d9-registration-dir\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475638 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7vb7\" (UniqueName: \"kubernetes.io/projected/6547d936-f7d9-4373-92ba-08a7e610c2c3-kube-api-access-c7vb7\") pod \"olm-operator-6b444d44fb-j2cbs\" (UID: \"6547d936-f7d9-4373-92ba-08a7e610c2c3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475663 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-bound-sa-token\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475724 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/203737df-98aa-47e4-856a-88670695ce9c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-c5ncx\" (UID: \"203737df-98aa-47e4-856a-88670695ce9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475777 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5d0c882b-060d-4104-9d29-0a9576607a4a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vcvhp\" (UID: \"5d0c882b-060d-4104-9d29-0a9576607a4a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475800 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/32eb3af2-4f0e-43f7-99ca-60eb556894d9-mountpoint-dir\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475822 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvplk\" (UniqueName: \"kubernetes.io/projected/7cbc7be3-c40d-4679-995b-de87266f7587-kube-api-access-qvplk\") pod \"machine-config-server-gvjlm\" (UID: \"7cbc7be3-c40d-4679-995b-de87266f7587\") " pod="openshift-machine-config-operator/machine-config-server-gvjlm" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475859 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7cbc7be3-c40d-4679-995b-de87266f7587-node-bootstrap-token\") pod \"machine-config-server-gvjlm\" (UID: \"7cbc7be3-c40d-4679-995b-de87266f7587\") " pod="openshift-machine-config-operator/machine-config-server-gvjlm" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475905 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/32eb3af2-4f0e-43f7-99ca-60eb556894d9-socket-dir\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.475963 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f159324-ddf2-49a4-b432-46e5efd09e16-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hpqxb\" (UID: \"2f159324-ddf2-49a4-b432-46e5efd09e16\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.476033 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe9caff3-df9d-4bf2-a811-c7bba9f5856e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zrt4t\" (UID: \"fe9caff3-df9d-4bf2-a811-c7bba9f5856e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.476068 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29pxj\" (UniqueName: \"kubernetes.io/projected/5dba8f90-eac6-468f-91c7-b95d2d04b2ec-kube-api-access-29pxj\") pod \"dns-default-942tg\" (UID: \"5dba8f90-eac6-468f-91c7-b95d2d04b2ec\") " pod="openshift-dns/dns-default-942tg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.477700 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx8jk\" (UniqueName: 
\"kubernetes.io/projected/b56b2f61-0758-42d4-a4c8-77c9848991ab-kube-api-access-nx8jk\") pod \"migrator-59844c95c7-zsdch\" (UID: \"b56b2f61-0758-42d4-a4c8-77c9848991ab\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zsdch" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.478210 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtmch\" (UniqueName: \"kubernetes.io/projected/7cc4bac1-3b43-4b15-b1b6-7c738e625dd7-kube-api-access-gtmch\") pod \"ingress-operator-5b745b69d9-j66jg\" (UID: \"7cc4bac1-3b43-4b15-b1b6-7c738e625dd7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.478656 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/478f27ac-050e-4086-82c9-2e23559cf70b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.479088 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsnrt\" (UniqueName: \"kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-kube-api-access-fsnrt\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.479534 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e1622b1d-7974-49dd-aa02-14972bf9cb8a-signing-cabundle\") pod \"service-ca-9c57cc56f-rlrp9\" (UID: \"e1622b1d-7974-49dd-aa02-14972bf9cb8a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlrp9" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.479757 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/478f27ac-050e-4086-82c9-2e23559cf70b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.480745 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/77a9c147-cee1-4a60-a2e1-8dd93096f35f-tmpfs\") pod \"packageserver-d55dfcdfc-xt6kz\" (UID: \"77a9c147-cee1-4a60-a2e1-8dd93096f35f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.480804 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c452acc1-8a32-47e7-9f61-4a7203b878c0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-z8tqh\" (UID: \"c452acc1-8a32-47e7-9f61-4a7203b878c0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z8tqh" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.482377 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c7c6d84-6feb-47fe-8f80-0cd9ca2e7686-metrics-tls\") pod 
\"dns-operator-744455d44c-h7lkc\" (UID: \"5c7c6d84-6feb-47fe-8f80-0cd9ca2e7686\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7lkc" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.482445 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/478f27ac-050e-4086-82c9-2e23559cf70b-trusted-ca\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.482505 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2209b7cc-875e-4b43-94ac-555eefedcc58-profile-collector-cert\") pod \"catalog-operator-68c6474976-s7kfk\" (UID: \"2209b7cc-875e-4b43-94ac-555eefedcc58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.482653 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdcwr\" (UniqueName: \"kubernetes.io/projected/2209b7cc-875e-4b43-94ac-555eefedcc58-kube-api-access-jdcwr\") pod \"catalog-operator-68c6474976-s7kfk\" (UID: \"2209b7cc-875e-4b43-94ac-555eefedcc58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.482950 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.483143 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-registry-tls\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.483168 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/203737df-98aa-47e4-856a-88670695ce9c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-c5ncx\" (UID: \"203737df-98aa-47e4-856a-88670695ce9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.483236 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cc4bac1-3b43-4b15-b1b6-7c738e625dd7-metrics-tls\") pod \"ingress-operator-5b745b69d9-j66jg\" (UID: \"7cc4bac1-3b43-4b15-b1b6-7c738e625dd7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.483318 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6547d936-f7d9-4373-92ba-08a7e610c2c3-srv-cert\") pod \"olm-operator-6b444d44fb-j2cbs\" (UID: \"6547d936-f7d9-4373-92ba-08a7e610c2c3\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.483415 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e1622b1d-7974-49dd-aa02-14972bf9cb8a-signing-key\") pod \"service-ca-9c57cc56f-rlrp9\" (UID: \"e1622b1d-7974-49dd-aa02-14972bf9cb8a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlrp9" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.483439 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpf69\" (UniqueName: \"kubernetes.io/projected/c452acc1-8a32-47e7-9f61-4a7203b878c0-kube-api-access-xpf69\") pod \"control-plane-machine-set-operator-78cbb6b69f-z8tqh\" (UID: \"c452acc1-8a32-47e7-9f61-4a7203b878c0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z8tqh" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.483470 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v88s\" (UniqueName: \"kubernetes.io/projected/32eb3af2-4f0e-43f7-99ca-60eb556894d9-kube-api-access-4v88s\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.484006 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/478f27ac-050e-4086-82c9-2e23559cf70b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: E1003 07:50:41.483594 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:41.983578502 +0000 UTC m=+142.804768992 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.484910 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76lbb\" (UniqueName: \"kubernetes.io/projected/203737df-98aa-47e4-856a-88670695ce9c-kube-api-access-76lbb\") pod \"kube-storage-version-migrator-operator-b67b599dd-c5ncx\" (UID: \"203737df-98aa-47e4-856a-88670695ce9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.484976 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7cc4bac1-3b43-4b15-b1b6-7c738e625dd7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j66jg\" (UID: \"7cc4bac1-3b43-4b15-b1b6-7c738e625dd7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.485022 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/478f27ac-050e-4086-82c9-2e23559cf70b-trusted-ca\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.485938 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6547d936-f7d9-4373-92ba-08a7e610c2c3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-j2cbs\" (UID: \"6547d936-f7d9-4373-92ba-08a7e610c2c3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.486056 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7cc4bac1-3b43-4b15-b1b6-7c738e625dd7-trusted-ca\") pod \"ingress-operator-5b745b69d9-j66jg\" (UID: \"7cc4bac1-3b43-4b15-b1b6-7c738e625dd7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.486079 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f159324-ddf2-49a4-b432-46e5efd09e16-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hpqxb\" (UID: \"2f159324-ddf2-49a4-b432-46e5efd09e16\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.486120 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe9caff3-df9d-4bf2-a811-c7bba9f5856e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zrt4t\" (UID: \"fe9caff3-df9d-4bf2-a811-c7bba9f5856e\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.486181 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/32eb3af2-4f0e-43f7-99ca-60eb556894d9-csi-data-dir\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.486203 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/478f27ac-050e-4086-82c9-2e23559cf70b-registry-certificates\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.486220 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7cbc7be3-c40d-4679-995b-de87266f7587-certs\") pod \"machine-config-server-gvjlm\" (UID: \"7cbc7be3-c40d-4679-995b-de87266f7587\") " pod="openshift-machine-config-operator/machine-config-server-gvjlm" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.486261 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f159324-ddf2-49a4-b432-46e5efd09e16-config\") pod \"kube-apiserver-operator-766d6c64bb-hpqxb\" (UID: \"2f159324-ddf2-49a4-b432-46e5efd09e16\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.486279 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe9caff3-df9d-4bf2-a811-c7bba9f5856e-config\") pod \"kube-controller-manager-operator-78b949d7b-zrt4t\" (UID: \"fe9caff3-df9d-4bf2-a811-c7bba9f5856e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.486301 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/32eb3af2-4f0e-43f7-99ca-60eb556894d9-plugins-dir\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.486328 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/203737df-98aa-47e4-856a-88670695ce9c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-c5ncx\" (UID: \"203737df-98aa-47e4-856a-88670695ce9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.486347 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6s55\" (UniqueName: \"kubernetes.io/projected/e1622b1d-7974-49dd-aa02-14972bf9cb8a-kube-api-access-d6s55\") pod \"service-ca-9c57cc56f-rlrp9\" (UID: \"e1622b1d-7974-49dd-aa02-14972bf9cb8a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlrp9" Oct 03 07:50:41 crc 
kubenswrapper[4664]: I1003 07:50:41.486376 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dba8f90-eac6-468f-91c7-b95d2d04b2ec-metrics-tls\") pod \"dns-default-942tg\" (UID: \"5dba8f90-eac6-468f-91c7-b95d2d04b2ec\") " pod="openshift-dns/dns-default-942tg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.486428 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5knx5\" (UniqueName: \"kubernetes.io/projected/77a9c147-cee1-4a60-a2e1-8dd93096f35f-kube-api-access-5knx5\") pod \"packageserver-d55dfcdfc-xt6kz\" (UID: \"77a9c147-cee1-4a60-a2e1-8dd93096f35f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.488447 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2209b7cc-875e-4b43-94ac-555eefedcc58-profile-collector-cert\") pod \"catalog-operator-68c6474976-s7kfk\" (UID: \"2209b7cc-875e-4b43-94ac-555eefedcc58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.488650 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f159324-ddf2-49a4-b432-46e5efd09e16-config\") pod \"kube-apiserver-operator-766d6c64bb-hpqxb\" (UID: \"2f159324-ddf2-49a4-b432-46e5efd09e16\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.487993 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/203737df-98aa-47e4-856a-88670695ce9c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-c5ncx\" (UID: \"203737df-98aa-47e4-856a-88670695ce9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.489326 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/478f27ac-050e-4086-82c9-2e23559cf70b-registry-certificates\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.490311 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2209b7cc-875e-4b43-94ac-555eefedcc58-srv-cert\") pod \"catalog-operator-68c6474976-s7kfk\" (UID: \"2209b7cc-875e-4b43-94ac-555eefedcc58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.513205 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5gc5\" (UniqueName: \"kubernetes.io/projected/5c7c6d84-6feb-47fe-8f80-0cd9ca2e7686-kube-api-access-c5gc5\") pod \"dns-operator-744455d44c-h7lkc\" (UID: \"5c7c6d84-6feb-47fe-8f80-0cd9ca2e7686\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7lkc" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.524952 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jxft6" 
event={"ID":"bcb87980-5888-4a30-859f-a9ac5b95f2c0","Type":"ContainerStarted","Data":"c39b1588963bb8308efb9a907fdee13768846358f693650d5e0d03be8300c1a9"} Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.534086 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jtp75" event={"ID":"33f7be8a-c6e0-47cd-b9bf-d42447f23980","Type":"ContainerStarted","Data":"98cd7cbbd50c142756a8eb0051d8639ee7edee2137bfb80fd598ceeae1e17002"} Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.537037 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" event={"ID":"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03","Type":"ContainerStarted","Data":"a6e177761557151257501904255ec8e94c8e46f85fa67ec67cb557b0406ca568"} Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.538477 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx8jk\" (UniqueName: \"kubernetes.io/projected/b56b2f61-0758-42d4-a4c8-77c9848991ab-kube-api-access-nx8jk\") pod \"migrator-59844c95c7-zsdch\" (UID: \"b56b2f61-0758-42d4-a4c8-77c9848991ab\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zsdch" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.542977 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" event={"ID":"cf947105-e97d-4a1c-9b59-bf6b37461c1e","Type":"ContainerStarted","Data":"3ebb4f216ef38315c507dba210b0bd4c9426b9e7c02e4ee9007d37313e2d1919"} Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.543861 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" event={"ID":"76eee4ae-1408-4cd8-819a-a5ef5c887b9a","Type":"ContainerStarted","Data":"d8a50192156120ffc8e0d5d810ee412aba84e584d2f64a5ba7b09bf429d40daa"} Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.545150 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" event={"ID":"f3a78dac-6d58-4647-83fb-b0f36f2f660a","Type":"ContainerStarted","Data":"b47cfeb25674babb8d95c1c909c3be9f7a9f7577e6266d9aaff26ceb6d448da1"} Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.545178 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" event={"ID":"f3a78dac-6d58-4647-83fb-b0f36f2f660a","Type":"ContainerStarted","Data":"5bdb6668dcf673c115e07695b7b3f5363174dccfe78b4ed087b77a98626e5e83"} Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.546128 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" event={"ID":"ef3e87db-9d6e-400e-bd6e-bd578baabbf6","Type":"ContainerStarted","Data":"dc2c157470a2b5299406aa662b3d2e2a31bc3310165cf376e03c9091cf4aa5ac"} Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.554330 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-bound-sa-token\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.561796 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gj7qw"] Oct 03 07:50:41 crc 
kubenswrapper[4664]: I1003 07:50:41.579023 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f159324-ddf2-49a4-b432-46e5efd09e16-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hpqxb\" (UID: \"2f159324-ddf2-49a4-b432-46e5efd09e16\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb" Oct 03 07:50:41 crc kubenswrapper[4664]: E1003 07:50:41.588036 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:42.088003463 +0000 UTC m=+142.909193953 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.589213 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.598391 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7"] Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602448 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7cc4bac1-3b43-4b15-b1b6-7c738e625dd7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j66jg\" (UID: \"7cc4bac1-3b43-4b15-b1b6-7c738e625dd7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602538 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e1622b1d-7974-49dd-aa02-14972bf9cb8a-signing-key\") pod \"service-ca-9c57cc56f-rlrp9\" (UID: \"e1622b1d-7974-49dd-aa02-14972bf9cb8a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlrp9" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602567 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpf69\" (UniqueName: \"kubernetes.io/projected/c452acc1-8a32-47e7-9f61-4a7203b878c0-kube-api-access-xpf69\") pod \"control-plane-machine-set-operator-78cbb6b69f-z8tqh\" (UID: \"c452acc1-8a32-47e7-9f61-4a7203b878c0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z8tqh" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602594 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v88s\" (UniqueName: \"kubernetes.io/projected/32eb3af2-4f0e-43f7-99ca-60eb556894d9-kube-api-access-4v88s\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:41 crc 
kubenswrapper[4664]: I1003 07:50:41.602654 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7cc4bac1-3b43-4b15-b1b6-7c738e625dd7-trusted-ca\") pod \"ingress-operator-5b745b69d9-j66jg\" (UID: \"7cc4bac1-3b43-4b15-b1b6-7c738e625dd7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602677 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6547d936-f7d9-4373-92ba-08a7e610c2c3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-j2cbs\" (UID: \"6547d936-f7d9-4373-92ba-08a7e610c2c3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602708 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe9caff3-df9d-4bf2-a811-c7bba9f5856e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zrt4t\" (UID: \"fe9caff3-df9d-4bf2-a811-c7bba9f5856e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602736 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/32eb3af2-4f0e-43f7-99ca-60eb556894d9-csi-data-dir\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602763 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7cbc7be3-c40d-4679-995b-de87266f7587-certs\") pod \"machine-config-server-gvjlm\" (UID: \"7cbc7be3-c40d-4679-995b-de87266f7587\") " pod="openshift-machine-config-operator/machine-config-server-gvjlm" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602791 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe9caff3-df9d-4bf2-a811-c7bba9f5856e-config\") pod \"kube-controller-manager-operator-78b949d7b-zrt4t\" (UID: \"fe9caff3-df9d-4bf2-a811-c7bba9f5856e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602813 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/32eb3af2-4f0e-43f7-99ca-60eb556894d9-plugins-dir\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602838 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6s55\" (UniqueName: \"kubernetes.io/projected/e1622b1d-7974-49dd-aa02-14972bf9cb8a-kube-api-access-d6s55\") pod \"service-ca-9c57cc56f-rlrp9\" (UID: \"e1622b1d-7974-49dd-aa02-14972bf9cb8a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlrp9" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602866 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dba8f90-eac6-468f-91c7-b95d2d04b2ec-metrics-tls\") pod 
\"dns-default-942tg\" (UID: \"5dba8f90-eac6-468f-91c7-b95d2d04b2ec\") " pod="openshift-dns/dns-default-942tg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602900 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74867a3b-68c8-4f94-9e0c-77c22db7b494-cert\") pod \"ingress-canary-bs9r7\" (UID: \"74867a3b-68c8-4f94-9e0c-77c22db7b494\") " pod="openshift-ingress-canary/ingress-canary-bs9r7" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602930 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5knx5\" (UniqueName: \"kubernetes.io/projected/77a9c147-cee1-4a60-a2e1-8dd93096f35f-kube-api-access-5knx5\") pod \"packageserver-d55dfcdfc-xt6kz\" (UID: \"77a9c147-cee1-4a60-a2e1-8dd93096f35f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602958 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d0c882b-060d-4104-9d29-0a9576607a4a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vcvhp\" (UID: \"5d0c882b-060d-4104-9d29-0a9576607a4a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.602992 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5dba8f90-eac6-468f-91c7-b95d2d04b2ec-config-volume\") pod \"dns-default-942tg\" (UID: \"5dba8f90-eac6-468f-91c7-b95d2d04b2ec\") " pod="openshift-dns/dns-default-942tg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603018 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/77a9c147-cee1-4a60-a2e1-8dd93096f35f-apiservice-cert\") pod \"packageserver-d55dfcdfc-xt6kz\" (UID: \"77a9c147-cee1-4a60-a2e1-8dd93096f35f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603071 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/77a9c147-cee1-4a60-a2e1-8dd93096f35f-webhook-cert\") pod \"packageserver-d55dfcdfc-xt6kz\" (UID: \"77a9c147-cee1-4a60-a2e1-8dd93096f35f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603097 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d0c882b-060d-4104-9d29-0a9576607a4a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vcvhp\" (UID: \"5d0c882b-060d-4104-9d29-0a9576607a4a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603144 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/32eb3af2-4f0e-43f7-99ca-60eb556894d9-registration-dir\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603171 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-vsdtn\" (UniqueName: \"kubernetes.io/projected/74867a3b-68c8-4f94-9e0c-77c22db7b494-kube-api-access-vsdtn\") pod \"ingress-canary-bs9r7\" (UID: \"74867a3b-68c8-4f94-9e0c-77c22db7b494\") " pod="openshift-ingress-canary/ingress-canary-bs9r7" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603200 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7vb7\" (UniqueName: \"kubernetes.io/projected/6547d936-f7d9-4373-92ba-08a7e610c2c3-kube-api-access-c7vb7\") pod \"olm-operator-6b444d44fb-j2cbs\" (UID: \"6547d936-f7d9-4373-92ba-08a7e610c2c3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603232 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d0c882b-060d-4104-9d29-0a9576607a4a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vcvhp\" (UID: \"5d0c882b-060d-4104-9d29-0a9576607a4a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603257 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/32eb3af2-4f0e-43f7-99ca-60eb556894d9-mountpoint-dir\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603284 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvplk\" (UniqueName: \"kubernetes.io/projected/7cbc7be3-c40d-4679-995b-de87266f7587-kube-api-access-qvplk\") pod \"machine-config-server-gvjlm\" (UID: \"7cbc7be3-c40d-4679-995b-de87266f7587\") " pod="openshift-machine-config-operator/machine-config-server-gvjlm" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603309 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7cbc7be3-c40d-4679-995b-de87266f7587-node-bootstrap-token\") pod \"machine-config-server-gvjlm\" (UID: \"7cbc7be3-c40d-4679-995b-de87266f7587\") " pod="openshift-machine-config-operator/machine-config-server-gvjlm" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603336 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/32eb3af2-4f0e-43f7-99ca-60eb556894d9-socket-dir\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603368 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe9caff3-df9d-4bf2-a811-c7bba9f5856e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zrt4t\" (UID: \"fe9caff3-df9d-4bf2-a811-c7bba9f5856e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603395 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29pxj\" (UniqueName: \"kubernetes.io/projected/5dba8f90-eac6-468f-91c7-b95d2d04b2ec-kube-api-access-29pxj\") pod \"dns-default-942tg\" (UID: 
\"5dba8f90-eac6-468f-91c7-b95d2d04b2ec\") " pod="openshift-dns/dns-default-942tg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603430 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtmch\" (UniqueName: \"kubernetes.io/projected/7cc4bac1-3b43-4b15-b1b6-7c738e625dd7-kube-api-access-gtmch\") pod \"ingress-operator-5b745b69d9-j66jg\" (UID: \"7cc4bac1-3b43-4b15-b1b6-7c738e625dd7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603474 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e1622b1d-7974-49dd-aa02-14972bf9cb8a-signing-cabundle\") pod \"service-ca-9c57cc56f-rlrp9\" (UID: \"e1622b1d-7974-49dd-aa02-14972bf9cb8a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlrp9" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603502 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/77a9c147-cee1-4a60-a2e1-8dd93096f35f-tmpfs\") pod \"packageserver-d55dfcdfc-xt6kz\" (UID: \"77a9c147-cee1-4a60-a2e1-8dd93096f35f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603528 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c452acc1-8a32-47e7-9f61-4a7203b878c0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-z8tqh\" (UID: \"c452acc1-8a32-47e7-9f61-4a7203b878c0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z8tqh" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603592 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603637 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cc4bac1-3b43-4b15-b1b6-7c738e625dd7-metrics-tls\") pod \"ingress-operator-5b745b69d9-j66jg\" (UID: \"7cc4bac1-3b43-4b15-b1b6-7c738e625dd7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603659 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6547d936-f7d9-4373-92ba-08a7e610c2c3-srv-cert\") pod \"olm-operator-6b444d44fb-j2cbs\" (UID: \"6547d936-f7d9-4373-92ba-08a7e610c2c3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.603867 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/32eb3af2-4f0e-43f7-99ca-60eb556894d9-plugins-dir\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.605313 4664 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5dba8f90-eac6-468f-91c7-b95d2d04b2ec-config-volume\") pod \"dns-default-942tg\" (UID: \"5dba8f90-eac6-468f-91c7-b95d2d04b2ec\") " pod="openshift-dns/dns-default-942tg" Oct 03 07:50:41 crc kubenswrapper[4664]: E1003 07:50:41.605654 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:42.105636981 +0000 UTC m=+142.926827641 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.606240 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7cc4bac1-3b43-4b15-b1b6-7c738e625dd7-trusted-ca\") pod \"ingress-operator-5b745b69d9-j66jg\" (UID: \"7cc4bac1-3b43-4b15-b1b6-7c738e625dd7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.606642 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/32eb3af2-4f0e-43f7-99ca-60eb556894d9-socket-dir\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.606728 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/32eb3af2-4f0e-43f7-99ca-60eb556894d9-csi-data-dir\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.607413 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe9caff3-df9d-4bf2-a811-c7bba9f5856e-config\") pod \"kube-controller-manager-operator-78b949d7b-zrt4t\" (UID: \"fe9caff3-df9d-4bf2-a811-c7bba9f5856e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.608415 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/77a9c147-cee1-4a60-a2e1-8dd93096f35f-tmpfs\") pod \"packageserver-d55dfcdfc-xt6kz\" (UID: \"77a9c147-cee1-4a60-a2e1-8dd93096f35f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.609688 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d0c882b-060d-4104-9d29-0a9576607a4a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vcvhp\" (UID: \"5d0c882b-060d-4104-9d29-0a9576607a4a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp" Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 
07:50:41.610410 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/32eb3af2-4f0e-43f7-99ca-60eb556894d9-registration-dir\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.611483 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e1622b1d-7974-49dd-aa02-14972bf9cb8a-signing-cabundle\") pod \"service-ca-9c57cc56f-rlrp9\" (UID: \"e1622b1d-7974-49dd-aa02-14972bf9cb8a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlrp9"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.611783 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/32eb3af2-4f0e-43f7-99ca-60eb556894d9-mountpoint-dir\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.619903 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm"]
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.622846 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6547d936-f7d9-4373-92ba-08a7e610c2c3-srv-cert\") pod \"olm-operator-6b444d44fb-j2cbs\" (UID: \"6547d936-f7d9-4373-92ba-08a7e610c2c3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.623532 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsnrt\" (UniqueName: \"kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-kube-api-access-fsnrt\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.626324 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d0c882b-060d-4104-9d29-0a9576607a4a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vcvhp\" (UID: \"5d0c882b-060d-4104-9d29-0a9576607a4a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.626995 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e1622b1d-7974-49dd-aa02-14972bf9cb8a-signing-key\") pod \"service-ca-9c57cc56f-rlrp9\" (UID: \"e1622b1d-7974-49dd-aa02-14972bf9cb8a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlrp9"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.627518 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/77a9c147-cee1-4a60-a2e1-8dd93096f35f-apiservice-cert\") pod \"packageserver-d55dfcdfc-xt6kz\" (UID: \"77a9c147-cee1-4a60-a2e1-8dd93096f35f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.627752 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/77a9c147-cee1-4a60-a2e1-8dd93096f35f-webhook-cert\") pod \"packageserver-d55dfcdfc-xt6kz\" (UID: \"77a9c147-cee1-4a60-a2e1-8dd93096f35f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.632648 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg"]
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.633538 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cc4bac1-3b43-4b15-b1b6-7c738e625dd7-metrics-tls\") pod \"ingress-operator-5b745b69d9-j66jg\" (UID: \"7cc4bac1-3b43-4b15-b1b6-7c738e625dd7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.634217 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7cbc7be3-c40d-4679-995b-de87266f7587-certs\") pod \"machine-config-server-gvjlm\" (UID: \"7cbc7be3-c40d-4679-995b-de87266f7587\") " pod="openshift-machine-config-operator/machine-config-server-gvjlm"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.634271 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b6htg"]
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.635915 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6547d936-f7d9-4373-92ba-08a7e610c2c3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-j2cbs\" (UID: \"6547d936-f7d9-4373-92ba-08a7e610c2c3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.636388 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dba8f90-eac6-468f-91c7-b95d2d04b2ec-metrics-tls\") pod \"dns-default-942tg\" (UID: \"5dba8f90-eac6-468f-91c7-b95d2d04b2ec\") " pod="openshift-dns/dns-default-942tg"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.637895 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h7lkc"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.639793 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdcwr\" (UniqueName: \"kubernetes.io/projected/2209b7cc-875e-4b43-94ac-555eefedcc58-kube-api-access-jdcwr\") pod \"catalog-operator-68c6474976-s7kfk\" (UID: \"2209b7cc-875e-4b43-94ac-555eefedcc58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.642717 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe9caff3-df9d-4bf2-a811-c7bba9f5856e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zrt4t\" (UID: \"fe9caff3-df9d-4bf2-a811-c7bba9f5856e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.644476 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c452acc1-8a32-47e7-9f61-4a7203b878c0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-z8tqh\" (UID: \"c452acc1-8a32-47e7-9f61-4a7203b878c0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z8tqh"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.645481 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7cbc7be3-c40d-4679-995b-de87266f7587-node-bootstrap-token\") pod \"machine-config-server-gvjlm\" (UID: \"7cbc7be3-c40d-4679-995b-de87266f7587\") " pod="openshift-machine-config-operator/machine-config-server-gvjlm"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.671237 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76lbb\" (UniqueName: \"kubernetes.io/projected/203737df-98aa-47e4-856a-88670695ce9c-kube-api-access-76lbb\") pod \"kube-storage-version-migrator-operator-b67b599dd-c5ncx\" (UID: \"203737df-98aa-47e4-856a-88670695ce9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.693782 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m"]
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.699299 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr"]
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.701129 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.702078 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.709894 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.710339 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsdtn\" (UniqueName: \"kubernetes.io/projected/74867a3b-68c8-4f94-9e0c-77c22db7b494-kube-api-access-vsdtn\") pod \"ingress-canary-bs9r7\" (UID: \"74867a3b-68c8-4f94-9e0c-77c22db7b494\") " pod="openshift-ingress-canary/ingress-canary-bs9r7"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.710799 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74867a3b-68c8-4f94-9e0c-77c22db7b494-cert\") pod \"ingress-canary-bs9r7\" (UID: \"74867a3b-68c8-4f94-9e0c-77c22db7b494\") " pod="openshift-ingress-canary/ingress-canary-bs9r7"
Oct 03 07:50:41 crc kubenswrapper[4664]: E1003 07:50:41.712098 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:42.212051629 +0000 UTC m=+143.033242119 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.712450 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v88s\" (UniqueName: \"kubernetes.io/projected/32eb3af2-4f0e-43f7-99ca-60eb556894d9-kube-api-access-4v88s\") pod \"csi-hostpathplugin-v4cvf\" (UID: \"32eb3af2-4f0e-43f7-99ca-60eb556894d9\") " pod="hostpath-provisioner/csi-hostpathplugin-v4cvf"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.716033 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-87c8s"]
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.721533 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l"]
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.724579 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6"]
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.726679 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74867a3b-68c8-4f94-9e0c-77c22db7b494-cert\") pod \"ingress-canary-bs9r7\" (UID: \"74867a3b-68c8-4f94-9e0c-77c22db7b494\") " pod="openshift-ingress-canary/ingress-canary-bs9r7"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.734861 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zcngp"]
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.735495 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6s55\" (UniqueName: \"kubernetes.io/projected/e1622b1d-7974-49dd-aa02-14972bf9cb8a-kube-api-access-d6s55\") pod \"service-ca-9c57cc56f-rlrp9\" (UID: \"e1622b1d-7974-49dd-aa02-14972bf9cb8a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlrp9"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.739367 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7cc4bac1-3b43-4b15-b1b6-7c738e625dd7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j66jg\" (UID: \"7cc4bac1-3b43-4b15-b1b6-7c738e625dd7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.763563 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpf69\" (UniqueName: \"kubernetes.io/projected/c452acc1-8a32-47e7-9f61-4a7203b878c0-kube-api-access-xpf69\") pod \"control-plane-machine-set-operator-78cbb6b69f-z8tqh\" (UID: \"c452acc1-8a32-47e7-9f61-4a7203b878c0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z8tqh"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.764981 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zsdch"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.778420 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe9caff3-df9d-4bf2-a811-c7bba9f5856e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zrt4t\" (UID: \"fe9caff3-df9d-4bf2-a811-c7bba9f5856e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t"
Oct 03 07:50:41 crc kubenswrapper[4664]: W1003 07:50:41.778552 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacf5523b_3f1d_495e_8014_0313925e8727.slice/crio-f4a19334ac12361d5b6b3fe64eb544b3a7f747aa87258449424bd00786be563d WatchSource:0}: Error finding container f4a19334ac12361d5b6b3fe64eb544b3a7f747aa87258449424bd00786be563d: Status 404 returned error can't find the container with id f4a19334ac12361d5b6b3fe64eb544b3a7f747aa87258449424bd00786be563d
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.802155 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvplk\" (UniqueName: \"kubernetes.io/projected/7cbc7be3-c40d-4679-995b-de87266f7587-kube-api-access-qvplk\") pod \"machine-config-server-gvjlm\" (UID: \"7cbc7be3-c40d-4679-995b-de87266f7587\") " pod="openshift-machine-config-operator/machine-config-server-gvjlm"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.812590 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:41 crc kubenswrapper[4664]: E1003 07:50:41.813229 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:42.313210339 +0000 UTC m=+143.134400899 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.821014 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-v4cvf"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.823483 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7vb7\" (UniqueName: \"kubernetes.io/projected/6547d936-f7d9-4373-92ba-08a7e610c2c3-kube-api-access-c7vb7\") pod \"olm-operator-6b444d44fb-j2cbs\" (UID: \"6547d936-f7d9-4373-92ba-08a7e610c2c3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.833074 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtmch\" (UniqueName: \"kubernetes.io/projected/7cc4bac1-3b43-4b15-b1b6-7c738e625dd7-kube-api-access-gtmch\") pod \"ingress-operator-5b745b69d9-j66jg\" (UID: \"7cc4bac1-3b43-4b15-b1b6-7c738e625dd7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.839040 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-gvjlm"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.863483 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29pxj\" (UniqueName: \"kubernetes.io/projected/5dba8f90-eac6-468f-91c7-b95d2d04b2ec-kube-api-access-29pxj\") pod \"dns-default-942tg\" (UID: \"5dba8f90-eac6-468f-91c7-b95d2d04b2ec\") " pod="openshift-dns/dns-default-942tg"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.889881 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d0c882b-060d-4104-9d29-0a9576607a4a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vcvhp\" (UID: \"5d0c882b-060d-4104-9d29-0a9576607a4a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.909383 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5knx5\" (UniqueName: \"kubernetes.io/projected/77a9c147-cee1-4a60-a2e1-8dd93096f35f-kube-api-access-5knx5\") pod \"packageserver-d55dfcdfc-xt6kz\" (UID: \"77a9c147-cee1-4a60-a2e1-8dd93096f35f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.914369 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:41 crc kubenswrapper[4664]: E1003 07:50:41.914932 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:42.414913348 +0000 UTC m=+143.236103838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.937079 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsdtn\" (UniqueName: \"kubernetes.io/projected/74867a3b-68c8-4f94-9e0c-77c22db7b494-kube-api-access-vsdtn\") pod \"ingress-canary-bs9r7\" (UID: \"74867a3b-68c8-4f94-9e0c-77c22db7b494\") " pod="openshift-ingress-canary/ingress-canary-bs9r7"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.957411 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.981268 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx"
Oct 03 07:50:41 crc kubenswrapper[4664]: I1003 07:50:41.984930 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp"
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.014451 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rlrp9"
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.017059 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:42 crc kubenswrapper[4664]: E1003 07:50:42.017496 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:42.517472215 +0000 UTC m=+143.338662885 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.020421 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t9rqg"]
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.026219 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t"
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.039257 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz"
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.057496 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z8tqh"
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.105290 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs"
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.118727 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:42 crc kubenswrapper[4664]: E1003 07:50:42.119869 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:42.619796855 +0000 UTC m=+143.440987535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.130480 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-942tg"
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.145739 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bs9r7"
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.180754 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vsmt5"]
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.188431 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx"]
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.220287 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:42 crc kubenswrapper[4664]: E1003 07:50:42.220595 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:42.720579323 +0000 UTC m=+143.541769853 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.240656 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m"]
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.321960 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:42 crc kubenswrapper[4664]: E1003 07:50:42.322139 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:42.822116715 +0000 UTC m=+143.643307215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.322583 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:42 crc kubenswrapper[4664]: E1003 07:50:42.322926 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:42.822918613 +0000 UTC m=+143.644109103 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.326204 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tm59k"]
Oct 03 07:50:42 crc kubenswrapper[4664]: W1003 07:50:42.349053 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab24ce7c_1a43_44de_98c3_e9bbdb0c7b6c.slice/crio-f5274f3cb62856eb08775c3a25037aa2f51b86f6e96b0414b913624f88184c95 WatchSource:0}: Error finding container f5274f3cb62856eb08775c3a25037aa2f51b86f6e96b0414b913624f88184c95: Status 404 returned error can't find the container with id f5274f3cb62856eb08775c3a25037aa2f51b86f6e96b0414b913624f88184c95
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.360924 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h7lkc"]
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.422569 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk"]
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.423280 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:42 crc kubenswrapper[4664]: E1003 07:50:42.423778 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:42.923763202 +0000 UTC m=+143.744953692 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.526433 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:42 crc kubenswrapper[4664]: E1003 07:50:42.526925 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:43.026905779 +0000 UTC m=+143.848096269 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.527362 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb"]
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.562233 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" event={"ID":"2a7ecab1-cd44-4dee-810a-1fd601b96eb4","Type":"ContainerStarted","Data":"ee4dcd67f3936d20d200e933f1b3a2557ea56ff337e4ce5a4ebbf79e29dcdbc8"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.565211 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" event={"ID":"f3a78dac-6d58-4647-83fb-b0f36f2f660a","Type":"ContainerStarted","Data":"7747c7811da973715b01bcea451854df5a208e55a7780cfc2d8212a3c09faaf9"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.571722 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h7lkc" event={"ID":"5c7c6d84-6feb-47fe-8f80-0cd9ca2e7686","Type":"ContainerStarted","Data":"d133ecf1eb5cff8fb511ba29219080f5e930f14899e20db6a1a88460976d75f8"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.573556 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b6htg" event={"ID":"0c70928e-51e2-4c62-8ce6-c1f8ba489f8f","Type":"ContainerStarted","Data":"06d3d7b7ddbe4a77e46bfbc33ad7e83ac2fe237bd944be4538914f536f063ef7"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.573616 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b6htg" event={"ID":"0c70928e-51e2-4c62-8ce6-c1f8ba489f8f","Type":"ContainerStarted","Data":"88cec6e8be1ce70abf119371a639be2a1265d9313f9156796015d6454b737b93"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.575452 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" event={"ID":"a6aa30cf-1aa9-4c2a-a079-40182a1c51ef","Type":"ContainerStarted","Data":"abd96ea4dfd8539f2c853d0855be0bf51c3f6af05651ed31cc457aa5f558f5a6"}
Oct 03 07:50:42 crc kubenswrapper[4664]: W1003 07:50:42.579134 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cbc7be3_c40d_4679_995b_de87266f7587.slice/crio-854b5291e59050d417574e2c9fa65ef40c7ad2cb54b87c2c35e23d9a71e21a12 WatchSource:0}: Error finding container 854b5291e59050d417574e2c9fa65ef40c7ad2cb54b87c2c35e23d9a71e21a12: Status 404 returned error can't find the container with id 854b5291e59050d417574e2c9fa65ef40c7ad2cb54b87c2c35e23d9a71e21a12
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.579946 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" event={"ID":"d2108a93-65c6-4626-9fb6-f93854855b80","Type":"ContainerStarted","Data":"bb084982fd1d915c5608ce530f19ddab27ea1480237583985428a77d34253c64"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.581250 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" event={"ID":"cf947105-e97d-4a1c-9b59-bf6b37461c1e","Type":"ContainerStarted","Data":"5c9e2acb46f4810eaf773f4cd064187a66df4475f0cdad646fdd494e5a5711d5"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.582560 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-h46lz"
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.587420 4664 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-h46lz container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" start-of-body=
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.587485 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" podUID="cf947105-e97d-4a1c-9b59-bf6b37461c1e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused"
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.597415 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2m6x7" event={"ID":"ceef7ba8-f996-4b56-a477-23873e39cde7","Type":"ContainerStarted","Data":"086931ca54355c86710ba259601c4f4286a007200d30c9e44598c5566186b8a5"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.597463 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2m6x7" event={"ID":"ceef7ba8-f996-4b56-a477-23873e39cde7","Type":"ContainerStarted","Data":"680a65dcdfed2bdc8b9984b9cfa41a1021e3051184894e539050a71b14a4bdc3"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.606934 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" event={"ID":"52515c5a-0f7d-42d4-90f8-97e66050f161","Type":"ContainerStarted","Data":"45bd1c7acdf26825eaa56e552f524b0bd4714f59613e3ba654e0cc612f2cc472"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.608997 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zsdch"]
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.609313 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" event={"ID":"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e","Type":"ContainerStarted","Data":"05edfd23a24c34af5986d8f26457b0d1ea4b17c287ece267a213f607d9bb5f28"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.613491 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" event={"ID":"76eee4ae-1408-4cd8-819a-a5ef5c887b9a","Type":"ContainerStarted","Data":"33ad7985d91bc2a951cf0ef6688a890f41e754849876687a7f87cd9104be59f6"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.615329 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tm59k" event={"ID":"c47b8dca-2e97-4aba-b303-00c0f2b36ecc","Type":"ContainerStarted","Data":"0443a607f3192b69322fc456522becd41d280097cd7be3a9744fd4f1e8673bf2"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.618892 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6" event={"ID":"8ebb51e4-a635-4c8a-b287-79c0e7d74a9c","Type":"ContainerStarted","Data":"7f68663f49a44bc866f88bf491880278b87a70f77c74172e7d229da137e681aa"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.622345 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" event={"ID":"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03","Type":"ContainerStarted","Data":"bfdc5c9858c31055e7fb71a810ef444fe98f36862c5c54c54bf46c24840a5952"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.623312 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj"
Oct 03 07:50:42 crc kubenswrapper[4664]: W1003 07:50:42.625628 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f159324_ddf2_49a4_b432_46e5efd09e16.slice/crio-7da1e3da9068b64f99b9f6fc69424bfdc435d67a4f2985e0d3cce316b7090bd0 WatchSource:0}: Error finding container 7da1e3da9068b64f99b9f6fc69424bfdc435d67a4f2985e0d3cce316b7090bd0: Status 404 returned error can't find the container with id 7da1e3da9068b64f99b9f6fc69424bfdc435d67a4f2985e0d3cce316b7090bd0
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.626821 4664 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sd2nj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body=
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.626874 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" podUID="1b63fb51-bfa4-4c92-a1b0-9044cf7cff03" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused"
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.627891 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:42 crc kubenswrapper[4664]: E1003 07:50:42.629035 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:43.129001311 +0000 UTC m=+143.950191821 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.629935 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jtp75" event={"ID":"33f7be8a-c6e0-47cd-b9bf-d42447f23980","Type":"ContainerStarted","Data":"80f99b6fcda4b4014653a5867abf20be85379a375e38d83f71b7e1c0b5146375"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.630891 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-jtp75"
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.633799 4664 patch_prober.go:28] interesting pod/console-operator-58897d9998-jtp75 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.633849 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-jtp75" podUID="33f7be8a-c6e0-47cd-b9bf-d42447f23980" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.634947 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-87c8s" event={"ID":"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1","Type":"ContainerStarted","Data":"96e81adf1e5e2db8ac4f578ff1f1bf74f4def00134773ab2276b18ffb6b74990"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.639544 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" event={"ID":"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91","Type":"ContainerStarted","Data":"e55ddbdee2a0207580bbf04cd051441d8f90da78dbb4a6116d66741cfbf9388e"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.640861 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" event={"ID":"4efd784b-b02d-4298-a96b-ed5663641afa","Type":"ContainerStarted","Data":"a4dcd594943948e7c9116c5878c9c29d03d0bffb47f432a37d6138483d4ea04d"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.643133 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zcngp" event={"ID":"acf5523b-3f1d-495e-8014-0313925e8727","Type":"ContainerStarted","Data":"f4a19334ac12361d5b6b3fe64eb544b3a7f747aa87258449424bd00786be563d"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.656551 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jxft6" event={"ID":"bcb87980-5888-4a30-859f-a9ac5b95f2c0","Type":"ContainerStarted","Data":"dfc183ea2d3088f950f9aee770c367bd7ea4bef0f1b5915d72e006b752e86442"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.659807 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk" event={"ID":"2209b7cc-875e-4b43-94ac-555eefedcc58","Type":"ContainerStarted","Data":"eb1ff58d6ef47ccbd309fddfb7daab541e97c2d0d096d5c3eb9b23f639b129b6"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.661370 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" event={"ID":"7d8683a6-e42e-4541-81b3-2b28fc5e7be6","Type":"ContainerStarted","Data":"899ff31998be3c3e26d4c76fe01e570950647856e461ec1f06eeac550608246a"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.661398 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" event={"ID":"7d8683a6-e42e-4541-81b3-2b28fc5e7be6","Type":"ContainerStarted","Data":"8a33b133b4d32f4064f28544899bdded9ee5b472c9601e24226acb8bc3d6ad7b"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.667158 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7" event={"ID":"f3284298-3f20-43d6-95ae-7d40c56534d3","Type":"ContainerStarted","Data":"5d2d45983f5275b4aa0971668a6ede69408edfa33532d452a0acee3ed71c30a6"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.667194 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7" event={"ID":"f3284298-3f20-43d6-95ae-7d40c56534d3","Type":"ContainerStarted","Data":"3a06ee4a8556bf36533e2af7d62e480e777cb5f1db1130d57a156ffa0e4190ed"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.672207 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" event={"ID":"171032ce-a5a4-4f30-bdc1-8c39e19efe99","Type":"ContainerStarted","Data":"13b25349698f4321db94ef6ba5743d65db4ce955819cf22392c129fa03cf8caf"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.673489 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vsmt5" event={"ID":"ab24ce7c-1a43-44de-98c3-e9bbdb0c7b6c","Type":"ContainerStarted","Data":"f5274f3cb62856eb08775c3a25037aa2f51b86f6e96b0414b913624f88184c95"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.675413 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" event={"ID":"ef3e87db-9d6e-400e-bd6e-bd578baabbf6","Type":"ContainerStarted","Data":"65f6921b9bcca7d03aec6363bf46989184ac92902770ada9680ef72c0c4b4a73"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.677092 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m" event={"ID":"989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5","Type":"ContainerStarted","Data":"463af75b8a120887396d1fdd71e93896f1948122fb5dc4c26db951f3e30a3c36"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.677123 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m" event={"ID":"989a2cb8-e14c-4ffb-9b1f-a5bea171b3f5","Type":"ContainerStarted","Data":"aaed9ea6a052d786b9d577fc363be9e4d5ac9da56e7e825a6757069054a1a6ea"}
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.740797 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:42 crc kubenswrapper[4664]: E1003 07:50:42.744094 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:43.244078283 +0000 UTC m=+144.065268773 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.786969 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-v4cvf"]
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.857055 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:42 crc kubenswrapper[4664]: E1003 07:50:42.857380 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:43.357365584 +0000 UTC m=+144.178556074 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:42 crc kubenswrapper[4664]: I1003 07:50:42.958263 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:42 crc kubenswrapper[4664]: E1003 07:50:42.958652 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:43.458637598 +0000 UTC m=+144.279828098 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.059181 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:43 crc kubenswrapper[4664]: E1003 07:50:43.059517 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:43.559492078 +0000 UTC m=+144.380682588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.161191 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:43 crc kubenswrapper[4664]: E1003 07:50:43.161731 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:43.661712134 +0000 UTC m=+144.482902674 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.185918 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-jxft6"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.187268 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.187326 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.227300 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t"]
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.262496 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:43 crc kubenswrapper[4664]: E1003 07:50:43.263205 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:43.763189855 +0000 UTC m=+144.584380345 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.269538 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-jxft6" podStartSLOduration=118.269513709 podStartE2EDuration="1m58.269513709s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:43.261738626 +0000 UTC m=+144.082929136" watchObservedRunningTime="2025-10-03 07:50:43.269513709 +0000 UTC m=+144.090704199"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.290500 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" podStartSLOduration=118.29047669 podStartE2EDuration="1m58.29047669s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:43.286160084 +0000 UTC m=+144.107350584" watchObservedRunningTime="2025-10-03 07:50:43.29047669 +0000 UTC m=+144.111667190"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.364409 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:43 crc kubenswrapper[4664]: E1003 07:50:43.364877 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:43.864862532 +0000 UTC m=+144.686053022 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.398759 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" podStartSLOduration=119.398739231 podStartE2EDuration="1m59.398739231s" podCreationTimestamp="2025-10-03 07:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:43.391368101 +0000 UTC m=+144.212558611" watchObservedRunningTime="2025-10-03 07:50:43.398739231 +0000 UTC m=+144.219929721"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.399432 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-8sdmc" podStartSLOduration=119.399422804 podStartE2EDuration="1m59.399422804s" podCreationTimestamp="2025-10-03 07:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:43.348187387 +0000 UTC m=+144.169377897" watchObservedRunningTime="2025-10-03 07:50:43.399422804 +0000 UTC m=+144.220613304"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.436930 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-2m6x7" podStartSLOduration=118.436913896 podStartE2EDuration="1m58.436913896s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:43.436344156 +0000 UTC m=+144.257534656" watchObservedRunningTime="2025-10-03 07:50:43.436913896 +0000 UTC m=+144.258104386"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.466273 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:43 crc kubenswrapper[4664]: E1003 07:50:43.466945 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:43.966926093 +0000 UTC m=+144.788116583 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.476187 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-wd7mj" podStartSLOduration=118.476163966 podStartE2EDuration="1m58.476163966s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:43.460216716 +0000 UTC m=+144.281407226" watchObservedRunningTime="2025-10-03 07:50:43.476163966 +0000 UTC m=+144.297354456"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.503526 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-jtp75" podStartSLOduration=118.503508364 podStartE2EDuration="1m58.503508364s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:43.502449178 +0000 UTC m=+144.323639688" watchObservedRunningTime="2025-10-03 07:50:43.503508364 +0000 UTC m=+144.324698854"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.547185 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs"]
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.576917 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z8tqh"]
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.578634 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:43 crc kubenswrapper[4664]: E1003 07:50:43.579123 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:44.079107537 +0000 UTC m=+144.900298027 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.582746 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz"]
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.594426 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-942tg"]
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.598076 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xrzpw" podStartSLOduration=118.59805549 podStartE2EDuration="1m58.59805549s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:43.532760516 +0000 UTC m=+144.353951016" watchObservedRunningTime="2025-10-03 07:50:43.59805549 +0000 UTC m=+144.419245980"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.602163 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg"]
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.605790 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjs9m" podStartSLOduration=119.605772351 podStartE2EDuration="1m59.605772351s" podCreationTimestamp="2025-10-03 07:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:43.581762697 +0000 UTC m=+144.402953217" watchObservedRunningTime="2025-10-03 07:50:43.605772351 +0000 UTC m=+144.426962861"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.668389 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp"]
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.683552 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:43 crc kubenswrapper[4664]: E1003 07:50:43.684218 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:44.18418578 +0000 UTC m=+145.005376270 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.699814 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7" event={"ID":"f3284298-3f20-43d6-95ae-7d40c56534d3","Type":"ContainerStarted","Data":"7c79dad9889d83f984e10523a9980757a36f51d64bf38febb8d5f6e507653c40"} Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.700421 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7" Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.703514 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg" event={"ID":"7cc4bac1-3b43-4b15-b1b6-7c738e625dd7","Type":"ContainerStarted","Data":"9383f054300eb1be6b99aa4a268b16bee15c0c221363ff23701c3e5893f16b8a"} Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.710575 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-942tg" event={"ID":"5dba8f90-eac6-468f-91c7-b95d2d04b2ec","Type":"ContainerStarted","Data":"29e6d24037174a69d764c8bf5949511b2288d600db440c9d24ba4133442f2930"} Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.713888 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h7lkc" event={"ID":"5c7c6d84-6feb-47fe-8f80-0cd9ca2e7686","Type":"ContainerStarted","Data":"27694d972f085799faced7d62c4c4c926a1d6df250d6c77f893b29f2a9f6f651"} Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.719779 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zcngp" event={"ID":"acf5523b-3f1d-495e-8014-0313925e8727","Type":"ContainerStarted","Data":"ce56aafdebe41cb90897a58cdf2421cc687816b9da9925f5473456e902587143"} Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.720088 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-zcngp" Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.721775 4664 patch_prober.go:28] interesting pod/downloads-7954f5f757-zcngp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.721838 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zcngp" podUID="acf5523b-3f1d-495e-8014-0313925e8727" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.726959 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs" event={"ID":"6547d936-f7d9-4373-92ba-08a7e610c2c3","Type":"ContainerStarted","Data":"aac219dfae585c81512024936aa3d87e3d179983d82dae47c2df6244da819873"} Oct 03 07:50:43 
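The repeating MountVolume.MountDevice / UnmountVolume.TearDown failures above are kubelet's volume reconciler backing off in fixed 500ms steps (durationBeforeRetry) because the kubevirt.io.hostpath-provisioner CSI driver has not yet registered on this node; the csi-hostpathplugin-v4cvf pod that provides it is only starting in this same window. A minimal sketch for measuring the retry cadence from a saved copy of this journal; the filename kubelet.log is a placeholder, and the regex only relies on the exact strings visible in the entries above:

```python
# Sketch: extract the kubelet volume-retry entries and print their cadence.
# Assumes the journal was saved as plain text; "kubelet.log" is hypothetical.
import re
import sys
from datetime import datetime

PATTERN = re.compile(
    r'No retries permitted until (?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+)'
    r'.*?Error: (?P<op>MountVolume\.MountDevice|UnmountVolume\.TearDown) failed'
)

prev = None
with open(sys.argv[1] if len(sys.argv) > 1 else "kubelet.log") as fh:
    for line in fh:
        m = PATTERN.search(line)
        if m is None:
            continue
        # Journal timestamps carry nanoseconds; trim to microseconds for %f.
        ts = datetime.strptime(m.group("ts")[:26], "%Y-%m-%d %H:%M:%S.%f")
        gap = (ts - prev).total_seconds() if prev else 0.0
        prev = ts
        print(f"{ts}  +{gap:6.3f}s  {m.group('op')}")
```

Run against the entries in this window, the gaps settle at roughly 100-200ms between the alternating mount and unmount attempts, each scheduling its next retry 500ms out.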
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.734569 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" event={"ID":"77a9c147-cee1-4a60-a2e1-8dd93096f35f","Type":"ContainerStarted","Data":"efa8cb65ba556090ba973092128faa078eb0d25b9d8ca700898472841c7678ff"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.748007 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" event={"ID":"ef3e87db-9d6e-400e-bd6e-bd578baabbf6","Type":"ContainerStarted","Data":"6bf8c3c9f093f79d942542509138f79e5cf2310752d0cdee426bef8f65e28a55"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.757679 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bs9r7"]
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.763035 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" event={"ID":"d2108a93-65c6-4626-9fb6-f93854855b80","Type":"ContainerStarted","Data":"95021a3928d41141e44e58d44c13a48e58a46a866d5e735d69cdead8a670c3fe"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.765687 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rlrp9"]
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.768207 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx"]
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.771371 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" event={"ID":"4efd784b-b02d-4298-a96b-ed5663641afa","Type":"ContainerStarted","Data":"0e9a92668969e66d232e906c266e4572dfc1ca353999125f1588251a903ce3e8"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.772186 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.781070 4664 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gj7qw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body=
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.781294 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" podUID="4efd784b-b02d-4298-a96b-ed5663641afa" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.785407 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:43 crc kubenswrapper[4664]: W1003 07:50:43.786308 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod203737df_98aa_47e4_856a_88670695ce9c.slice/crio-7fc16c9f61cc9876de35bd5877205765ce2b1ba463eea1e34470cf41a1d956cd WatchSource:0}: Error finding container 7fc16c9f61cc9876de35bd5877205765ce2b1ba463eea1e34470cf41a1d956cd: Status 404 returned error can't find the container with id 7fc16c9f61cc9876de35bd5877205765ce2b1ba463eea1e34470cf41a1d956cd
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.786682 4664 generic.go:334] "Generic (PLEG): container finished" podID="2a7ecab1-cd44-4dee-810a-1fd601b96eb4" containerID="44ff6870b18576894678ef3266d3c85d137a5c0b570489a41b9f199c92f00bb0" exitCode=0
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.786759 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" event={"ID":"2a7ecab1-cd44-4dee-810a-1fd601b96eb4","Type":"ContainerDied","Data":"44ff6870b18576894678ef3266d3c85d137a5c0b570489a41b9f199c92f00bb0"}
Oct 03 07:50:43 crc kubenswrapper[4664]: E1003 07:50:43.787649 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:44.287634578 +0000 UTC m=+145.108825148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:43 crc kubenswrapper[4664]: W1003 07:50:43.808882 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74867a3b_68c8_4f94_9e0c_77c22db7b494.slice/crio-869e37b6c2ac37a6a08e34a7987878baaa197cd30b2baa07b02cfe87ec9528c4 WatchSource:0}: Error finding container 869e37b6c2ac37a6a08e34a7987878baaa197cd30b2baa07b02cfe87ec9528c4: Status 404 returned error can't find the container with id 869e37b6c2ac37a6a08e34a7987878baaa197cd30b2baa07b02cfe87ec9528c4
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.809922 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6" event={"ID":"8ebb51e4-a635-4c8a-b287-79c0e7d74a9c","Type":"ContainerStarted","Data":"0aa4880b31c19e6ad16635512830dd26beaaa0d22f08f471fc2ff6f847e79fbf"}
Oct 03 07:50:43 crc kubenswrapper[4664]: W1003 07:50:43.817643 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1622b1d_7974_49dd_aa02_14972bf9cb8a.slice/crio-ecfe99544dddf8e7f624d2b173c749e96c9f0f43d9cbdcbc202ce14b871b4f09 WatchSource:0}: Error finding container ecfe99544dddf8e7f624d2b173c749e96c9f0f43d9cbdcbc202ce14b871b4f09: Status 404 returned error can't find the container with id ecfe99544dddf8e7f624d2b173c749e96c9f0f43d9cbdcbc202ce14b871b4f09
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.819673 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zsdch" event={"ID":"b56b2f61-0758-42d4-a4c8-77c9848991ab","Type":"ContainerStarted","Data":"30e353877fa2410346ed4b84b8b4f07e02d62285ea97a8e2b9c27e9db7600038"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.819728 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zsdch" event={"ID":"b56b2f61-0758-42d4-a4c8-77c9848991ab","Type":"ContainerStarted","Data":"65160705d8230117f3589b6d4402c74b6125d34da5bb66a59317cf21dabbe0f0"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.823168 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z8tqh" event={"ID":"c452acc1-8a32-47e7-9f61-4a7203b878c0","Type":"ContainerStarted","Data":"de47398d507828498677ef89a6fb7d0d7a858f58bfb54bfce982dc3fe793b9ad"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.828840 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" event={"ID":"32eb3af2-4f0e-43f7-99ca-60eb556894d9","Type":"ContainerStarted","Data":"2641764f0999a67b146b351ffe7f18b830967224aace5c7c1e737a1762fdae35"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.844585 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tm59k" event={"ID":"c47b8dca-2e97-4aba-b303-00c0f2b36ecc","Type":"ContainerStarted","Data":"7fbae3dfdff68f7ff86a5f34b329bcabf85cbfbc9d860212244ab3536edaa6b3"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.857692 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t" event={"ID":"fe9caff3-df9d-4bf2-a811-c7bba9f5856e","Type":"ContainerStarted","Data":"223ed89649783d1869359ebb5610129edc9399d88e00ada85d3d734b17b5f35a"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.859551 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb" event={"ID":"2f159324-ddf2-49a4-b432-46e5efd09e16","Type":"ContainerStarted","Data":"7da1e3da9068b64f99b9f6fc69424bfdc435d67a4f2985e0d3cce316b7090bd0"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.865184 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b6htg" event={"ID":"0c70928e-51e2-4c62-8ce6-c1f8ba489f8f","Type":"ContainerStarted","Data":"df1adeb10609b8b74c86bd9d94429774fd0010320024fe505467fb05b21d556f"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.868932 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" event={"ID":"52515c5a-0f7d-42d4-90f8-97e66050f161","Type":"ContainerStarted","Data":"6e0dcca8c7fbfe071bb15c20d29a48a373efac881ee692bc82a9f57b8ff5f736"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.868993 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" event={"ID":"52515c5a-0f7d-42d4-90f8-97e66050f161","Type":"ContainerStarted","Data":"8fef5022157a15387ce018df2c9c103785ce3274ce36cb61b8ff6052f1b23848"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.872424 4664 generic.go:334] "Generic (PLEG): container finished" podID="4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1" containerID="68789ee3468a7d8f0669fa53e623cf71efa11a905826399c6617d85ec9ab8db7" exitCode=0
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.872501 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-87c8s" event={"ID":"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1","Type":"ContainerDied","Data":"68789ee3468a7d8f0669fa53e623cf71efa11a905826399c6617d85ec9ab8db7"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.886197 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:43 crc kubenswrapper[4664]: E1003 07:50:43.886317 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:44.386295563 +0000 UTC m=+145.207486043 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.895818 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.896162 4664 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-sm42l container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.896299 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" podUID="11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Oct 03 07:50:43 crc kubenswrapper[4664]: E1003 07:50:43.898135 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:44.398117494 +0000 UTC m=+145.219307984 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.926819 4664 generic.go:334] "Generic (PLEG): container finished" podID="171032ce-a5a4-4f30-bdc1-8c39e19efe99" containerID="da1197518f8cc9c15aff49d821d4686beccf7243e08911293a050743b6aff399" exitCode=0
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.946921 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" event={"ID":"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e","Type":"ContainerStarted","Data":"1133a9ae7f3f82170f7e6137864a76e7e56515b17c4c0e57529489e2ae0a0aec"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.946974 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.946986 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" event={"ID":"171032ce-a5a4-4f30-bdc1-8c39e19efe99","Type":"ContainerDied","Data":"da1197518f8cc9c15aff49d821d4686beccf7243e08911293a050743b6aff399"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.946997 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk" event={"ID":"2209b7cc-875e-4b43-94ac-555eefedcc58","Type":"ContainerStarted","Data":"8726d3c8334934a314c0e2085ab872663d6119175af48c4b647f68d2ee736b12"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.947009 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.954270 4664 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-s7kfk container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body=
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.954584 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk" podUID="2209b7cc-875e-4b43-94ac-555eefedcc58" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.955830 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-gvjlm" event={"ID":"7cbc7be3-c40d-4679-995b-de87266f7587","Type":"ContainerStarted","Data":"cf4e8fdc174daa608f54c911132998cb1d7b7f2927d60b633c1d83c6905452c8"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.955877 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-gvjlm" event={"ID":"7cbc7be3-c40d-4679-995b-de87266f7587","Type":"ContainerStarted","Data":"854b5291e59050d417574e2c9fa65ef40c7ad2cb54b87c2c35e23d9a71e21a12"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.971078 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" event={"ID":"a6aa30cf-1aa9-4c2a-a079-40182a1c51ef","Type":"ContainerStarted","Data":"56ce0b71e2c874ee2c47120521d3547341c31b22f29d3163e584f1c76d0b30c7"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.986569 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" event={"ID":"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91","Type":"ContainerStarted","Data":"45f77438245ec43407abee2ee75046f6eb1389e179ca39cbe00dce89645fb880"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.994636 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vsmt5" event={"ID":"ab24ce7c-1a43-44de-98c3-e9bbdb0c7b6c","Type":"ContainerStarted","Data":"19359886fe3104f050cb0c791a1ff9b3efc8d7f53d22f4e1fa77bb41d26c617c"}
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.995078 4664 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sd2nj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body=
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.995110 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" podUID="1b63fb51-bfa4-4c92-a1b0-9044cf7cff03" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused"
Oct 03 07:50:43 crc kubenswrapper[4664]: I1003 07:50:43.996705 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:43 crc kubenswrapper[4664]: E1003 07:50:43.997202 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:44.497175173 +0000 UTC m=+145.318365663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.008923 4664 patch_prober.go:28] interesting pod/console-operator-58897d9998-jtp75 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.008997 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-jtp75" podUID="33f7be8a-c6e0-47cd-b9bf-d42447f23980" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.039347 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-h46lz"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.117526 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:44 crc kubenswrapper[4664]: E1003 07:50:44.119580 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:44.619564563 +0000 UTC m=+145.440755053 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.193012 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 07:50:44 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld
Oct 03 07:50:44 crc kubenswrapper[4664]: [+]process-running ok
Oct 03 07:50:44 crc kubenswrapper[4664]: healthz check failed
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.193443 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.219581 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
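Most of the "Observed pod startup duration" records in this window report a podStartSLOduration of roughly two minutes because the SLO clock starts at podCreationTimestamp (07:48:44-45) while the containers only came up around 07:50:43-45; firstStartedPulling and lastFinishedPulling equal to 0001-01-01 (the Go zero time) suggest no image pull was observed for these pods. A small sketch for tallying those records against a saved copy of the journal (the filename is again hypothetical), with a helper that parses the subset of Go duration syntax that appears here:

```python
# Sketch: tally the "Observed pod startup duration" records in the log.
import re
import sys

REC = re.compile(r'pod="(?P<pod>[^"]+)" podStartSLOduration=(?P<slo>[\d.]+) '
                 r'podStartE2EDuration="(?P<e2e>[^"]+)"')

def go_duration(s):
    """Parse the h/m/s subset of Go duration syntax (e.g. "1m58.269s")."""
    total, num = 0.0, ""
    units = {"h": 3600.0, "m": 60.0, "s": 1.0}
    for ch in s:
        if ch.isdigit() or ch == ".":
            num += ch
        else:
            total += float(num) * units[ch]
            num = ""
    return total

with open(sys.argv[1] if len(sys.argv) > 1 else "kubelet.log") as fh:
    for line in fh:
        m = REC.search(line)
        if m:
            # The SLO field is the same duration as e2e, already in seconds.
            print(f"e2e={go_duration(m.group('e2e')):8.3f}s  "
                  f"slo={float(m.group('slo')):8.3f}s  {m.group('pod')}")
```

The one outlier is machine-config-server-gvjlm, created at 07:50:38 and running six seconds later, which confirms the two-minute figures measure time since pod creation, not slow container starts.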
Oct 03 07:50:44 crc kubenswrapper[4664]: E1003 07:50:44.219920 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:44.719905836 +0000 UTC m=+145.541096326 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.272703 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-gvjlm" podStartSLOduration=6.272686785 podStartE2EDuration="6.272686785s" podCreationTimestamp="2025-10-03 07:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:44.272125966 +0000 UTC m=+145.093316456" watchObservedRunningTime="2025-10-03 07:50:44.272686785 +0000 UTC m=+145.093877275"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.321809 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb" podStartSLOduration=119.32177466 podStartE2EDuration="1m59.32177466s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:44.320512247 +0000 UTC m=+145.141702737" watchObservedRunningTime="2025-10-03 07:50:44.32177466 +0000 UTC m=+145.142965160"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.323328 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:44 crc kubenswrapper[4664]: E1003 07:50:44.323756 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:44.823743047 +0000 UTC m=+145.644933537 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.381205 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tm59k" podStartSLOduration=119.381178674 podStartE2EDuration="1m59.381178674s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:44.36513118 +0000 UTC m=+145.186321680" watchObservedRunningTime="2025-10-03 07:50:44.381178674 +0000 UTC m=+145.202369154"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.425014 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:44 crc kubenswrapper[4664]: E1003 07:50:44.425459 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:44.925442985 +0000 UTC m=+145.746633475 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.446333 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-zcngp" podStartSLOduration=119.446314393 podStartE2EDuration="1m59.446314393s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:44.443211137 +0000 UTC m=+145.264401637" watchObservedRunningTime="2025-10-03 07:50:44.446314393 +0000 UTC m=+145.267504883"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.491218 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9z9sg" podStartSLOduration=119.491197635 podStartE2EDuration="1m59.491197635s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:44.485425109 +0000 UTC m=+145.306615599" watchObservedRunningTime="2025-10-03 07:50:44.491197635 +0000 UTC m=+145.312388125"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.529752 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42xd6" podStartSLOduration=119.529726531 podStartE2EDuration="1m59.529726531s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:44.52969488 +0000 UTC m=+145.350885370" watchObservedRunningTime="2025-10-03 07:50:44.529726531 +0000 UTC m=+145.350917041"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.530833 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:44 crc kubenswrapper[4664]: E1003 07:50:44.531138 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:45.031127058 +0000 UTC m=+145.852317548 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.590396 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk" podStartSLOduration=119.590380658 podStartE2EDuration="1m59.590380658s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:44.558949122 +0000 UTC m=+145.380139642" watchObservedRunningTime="2025-10-03 07:50:44.590380658 +0000 UTC m=+145.411571148"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.634410 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:44 crc kubenswrapper[4664]: E1003 07:50:44.635116 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:45.135095044 +0000 UTC m=+145.956285534 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.679844 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t5kh" podStartSLOduration=120.67982136 podStartE2EDuration="2m0.67982136s" podCreationTimestamp="2025-10-03 07:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:44.677132199 +0000 UTC m=+145.498322699" watchObservedRunningTime="2025-10-03 07:50:44.67982136 +0000 UTC m=+145.501011870"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.717909 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7" podStartSLOduration=119.717886791 podStartE2EDuration="1m59.717886791s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:44.715075296 +0000 UTC m=+145.536265796" watchObservedRunningTime="2025-10-03 07:50:44.717886791 +0000 UTC m=+145.539077301"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.736762 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:44 crc kubenswrapper[4664]: E1003 07:50:44.737452 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:45.237437094 +0000 UTC m=+146.058627584 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.765032 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" podStartSLOduration=119.765011229 podStartE2EDuration="1m59.765011229s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:44.764793802 +0000 UTC m=+145.585984302" watchObservedRunningTime="2025-10-03 07:50:44.765011229 +0000 UTC m=+145.586201739"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.799797 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" podStartSLOduration=119.799778928 podStartE2EDuration="1m59.799778928s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:44.798773024 +0000 UTC m=+145.619963544" watchObservedRunningTime="2025-10-03 07:50:44.799778928 +0000 UTC m=+145.620969418"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.838719 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:44 crc kubenswrapper[4664]: E1003 07:50:44.839112 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:45.339081731 +0000 UTC m=+146.160272221 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.929003 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b6htg" podStartSLOduration=119.928986839 podStartE2EDuration="1m59.928986839s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:44.901897651 +0000 UTC m=+145.723088161" watchObservedRunningTime="2025-10-03 07:50:44.928986839 +0000 UTC m=+145.750177329"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.929337 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-t9rqg" podStartSLOduration=119.929332881 podStartE2EDuration="1m59.929332881s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:44.929246018 +0000 UTC m=+145.750436518" watchObservedRunningTime="2025-10-03 07:50:44.929332881 +0000 UTC m=+145.750523371"
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.943995 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:44 crc kubenswrapper[4664]: E1003 07:50:44.944432 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:45.444417052 +0000 UTC m=+146.265607542 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:44 crc kubenswrapper[4664]: I1003 07:50:44.973982 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" podStartSLOduration=120.973964154 podStartE2EDuration="2m0.973964154s" podCreationTimestamp="2025-10-03 07:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:44.972811615 +0000 UTC m=+145.794002115" watchObservedRunningTime="2025-10-03 07:50:44.973964154 +0000 UTC m=+145.795154654"
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.046481 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:45 crc kubenswrapper[4664]: E1003 07:50:45.048203 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:45.548171041 +0000 UTC m=+146.369361531 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.052031 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs" event={"ID":"6547d936-f7d9-4373-92ba-08a7e610c2c3","Type":"ContainerStarted","Data":"a91064a80f593d345ef2b641946ac8d5dad3ca6b3d8e1942e39cd584568ee6b1"}
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.053011 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs"
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.058865 4664 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-j2cbs container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body=
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.058962 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs" podUID="6547d936-f7d9-4373-92ba-08a7e610c2c3" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused"
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.072541 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" event={"ID":"2a7ecab1-cd44-4dee-810a-1fd601b96eb4","Type":"ContainerStarted","Data":"d26ac9bf4b7875d5e8a36dfc597c779f561da894a986aec67e7fe43748a44f02"}
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.120284 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rlrp9" event={"ID":"e1622b1d-7974-49dd-aa02-14972bf9cb8a","Type":"ContainerStarted","Data":"6fe372b00f779e27c8af7c6b920a7d1faf101ded442b20a55744e55b7855591b"}
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.120340 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rlrp9" event={"ID":"e1622b1d-7974-49dd-aa02-14972bf9cb8a","Type":"ContainerStarted","Data":"ecfe99544dddf8e7f624d2b173c749e96c9f0f43d9cbdcbc202ce14b871b4f09"}
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.127934 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs" podStartSLOduration=120.127908884 podStartE2EDuration="2m0.127908884s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.083091255 +0000 UTC m=+145.904281755" watchObservedRunningTime="2025-10-03 07:50:45.127908884 +0000 UTC m=+145.949099374"
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.128311 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" podStartSLOduration=120.128305638 podStartE2EDuration="2m0.128305638s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.122898134 +0000 UTC m=+145.944088634" watchObservedRunningTime="2025-10-03 07:50:45.128305638 +0000 UTC m=+145.949496128"
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.134842 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z8tqh" event={"ID":"c452acc1-8a32-47e7-9f61-4a7203b878c0","Type":"ContainerStarted","Data":"98d4f765b50e231caa5c2e3d04c6c7138b2e114cf2d95ac902907b1bb8ac055e"}
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.151590 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:45 crc kubenswrapper[4664]: E1003 07:50:45.152986 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:45.652972734 +0000 UTC m=+146.474163274 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.154439 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zsdch" event={"ID":"b56b2f61-0758-42d4-a4c8-77c9848991ab","Type":"ContainerStarted","Data":"ab0e2df5511efeb43982c567fa3574dc4b0cb9d49d603ea52a6ef3b7f4637424"}
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.183364 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg" event={"ID":"7cc4bac1-3b43-4b15-b1b6-7c738e625dd7","Type":"ContainerStarted","Data":"7d2a175c2387d52790cf3562d1e04d9def30934aff33c0e7bbc69c5e3871ad91"}
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.183413 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg" event={"ID":"7cc4bac1-3b43-4b15-b1b6-7c738e625dd7","Type":"ContainerStarted","Data":"91f4c5288b8e9b31645ab1f08b2b6f5de8137dd431c53e01d36518b9d03de773"}
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.194022 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z8tqh" podStartSLOduration=120.193989425 podStartE2EDuration="2m0.193989425s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.192791014 +0000 UTC m=+146.013981504" watchObservedRunningTime="2025-10-03 07:50:45.193989425 +0000 UTC m=+146.015179935"
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.194048 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bs9r7" event={"ID":"74867a3b-68c8-4f94-9e0c-77c22db7b494","Type":"ContainerStarted","Data":"27bbc700ea45425444da9c2aa42386e36450156887f8598a0afabeadf259867a"}
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.194918 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bs9r7" event={"ID":"74867a3b-68c8-4f94-9e0c-77c22db7b494","Type":"ContainerStarted","Data":"869e37b6c2ac37a6a08e34a7987878baaa197cd30b2baa07b02cfe87ec9528c4"}
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.195066 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 07:50:45 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld
Oct 03 07:50:45 crc kubenswrapper[4664]: [+]process-running ok
Oct 03 07:50:45 crc kubenswrapper[4664]: healthz check failed
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.195096 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.195552 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-rlrp9" podStartSLOduration=120.195507687 podStartE2EDuration="2m0.195507687s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.156529245 +0000 UTC m=+145.977719745" watchObservedRunningTime="2025-10-03 07:50:45.195507687 +0000 UTC m=+146.016698177"
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.204882 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpqxb" event={"ID":"2f159324-ddf2-49a4-b432-46e5efd09e16","Type":"ContainerStarted","Data":"608444b3f7862c4d0603f1601297cde1c9f0a4e018550b759fac883c5e50e8d4"}
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.207315 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" event={"ID":"a6aa30cf-1aa9-4c2a-a079-40182a1c51ef","Type":"ContainerStarted","Data":"614f8a735e003d10d2c027f8ae252b52a924f6bece040e75243ae17512b0ad0f"}
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.217451 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zsdch" podStartSLOduration=120.21743557 podStartE2EDuration="2m0.21743557s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.216104525 +0000 UTC m=+146.037295025" watchObservedRunningTime="2025-10-03 07:50:45.21743557 +0000 UTC m=+146.038626060"
Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.218893 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-dns-operator/dns-operator-744455d44c-h7lkc" event={"ID":"5c7c6d84-6feb-47fe-8f80-0cd9ca2e7686","Type":"ContainerStarted","Data":"4db20def79fbf23b89e1931be4a245535867dd38327526af476b75a00d6fcba7"} Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.224625 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vsmt5" event={"ID":"ab24ce7c-1a43-44de-98c3-e9bbdb0c7b6c","Type":"ContainerStarted","Data":"0e1002593b0323d9414ab2f456576198a04f9fd4ceccc19f97b6a2d2cf41b093"} Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.252141 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j66jg" podStartSLOduration=120.252126516 podStartE2EDuration="2m0.252126516s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.249103894 +0000 UTC m=+146.070294404" watchObservedRunningTime="2025-10-03 07:50:45.252126516 +0000 UTC m=+146.073317006" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.252898 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:45 crc kubenswrapper[4664]: E1003 07:50:45.254182 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:45.754156425 +0000 UTC m=+146.575346915 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.264003 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t" event={"ID":"fe9caff3-df9d-4bf2-a811-c7bba9f5856e","Type":"ContainerStarted","Data":"465feaa320e4b81a0da014e9fa42a16768dcdcbbae2ac553412c74fafa2b861b"} Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.294040 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx" event={"ID":"203737df-98aa-47e4-856a-88670695ce9c","Type":"ContainerStarted","Data":"d414f56acea449d67b91489bd790804fe0cea0182c0d03914538a6cd87b1989c"} Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.294099 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx" event={"ID":"203737df-98aa-47e4-856a-88670695ce9c","Type":"ContainerStarted","Data":"7fc16c9f61cc9876de35bd5877205765ce2b1ba463eea1e34470cf41a1d956cd"} Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.312773 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bs9r7" podStartSLOduration=7.312752042 podStartE2EDuration="7.312752042s" podCreationTimestamp="2025-10-03 07:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.305066492 +0000 UTC m=+146.126257002" watchObservedRunningTime="2025-10-03 07:50:45.312752042 +0000 UTC m=+146.133942542" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.354883 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:45 crc kubenswrapper[4664]: E1003 07:50:45.356957 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:45.85694048 +0000 UTC m=+146.678130970 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.361785 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-942tg" event={"ID":"5dba8f90-eac6-468f-91c7-b95d2d04b2ec","Type":"ContainerStarted","Data":"740b067f61bb5d59bd0c96316052cf9dc3daa4b156bc5ea6a86270aaea2c2ef1"} Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.362739 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-942tg" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.384686 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzfrx" podStartSLOduration=120.384665931 podStartE2EDuration="2m0.384665931s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.383999218 +0000 UTC m=+146.205189718" watchObservedRunningTime="2025-10-03 07:50:45.384665931 +0000 UTC m=+146.205856421" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.407880 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-87c8s" event={"ID":"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1","Type":"ContainerStarted","Data":"2a10345cb98625e5d84b0435f62028b5e75f164444f025ee72a552a20721e35f"} Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.413576 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" event={"ID":"77a9c147-cee1-4a60-a2e1-8dd93096f35f","Type":"ContainerStarted","Data":"66dc07454c2b938d6081cbbb472944e8035263731b3f056c15bb12a228f976c7"} Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.414988 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.417046 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" event={"ID":"171032ce-a5a4-4f30-bdc1-8c39e19efe99","Type":"ContainerStarted","Data":"a09fddb6b94f4eb24fa2f10bf7e59cbec3affa85f096532fbb815480d197782f"} Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.419171 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.460122 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.460715 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp" event={"ID":"5d0c882b-060d-4104-9d29-0a9576607a4a","Type":"ContainerStarted","Data":"3eabd3f14eef35c983e42cd4d7a0964080d4c57b93ed74bf7aa76f3db1b6e1b7"} Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.460775 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp" event={"ID":"5d0c882b-060d-4104-9d29-0a9576607a4a","Type":"ContainerStarted","Data":"ea5f8b954804c6cfbcda70b62d8aa26563db1af3072edf5c96fa26564fc606bd"} Oct 03 07:50:45 crc kubenswrapper[4664]: E1003 07:50:45.461510 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:45.961493376 +0000 UTC m=+146.782683866 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.462265 4664 patch_prober.go:28] interesting pod/downloads-7954f5f757-zcngp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.462296 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zcngp" podUID="acf5523b-3f1d-495e-8014-0313925e8727" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.463315 4664 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gj7qw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.463339 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" podUID="4efd784b-b02d-4298-a96b-ed5663641afa" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.466905 4664 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-xt6kz container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.466966 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" podUID="77a9c147-cee1-4a60-a2e1-8dd93096f35f" containerName="packageserver" probeResult="failure" 
output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.492996 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.502559 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s7kfk" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.565502 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:45 crc kubenswrapper[4664]: E1003 07:50:45.571162 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:46.071147174 +0000 UTC m=+146.892337664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.574243 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-942tg" podStartSLOduration=7.574224788 podStartE2EDuration="7.574224788s" podCreationTimestamp="2025-10-03 07:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.470391858 +0000 UTC m=+146.291582368" watchObservedRunningTime="2025-10-03 07:50:45.574224788 +0000 UTC m=+146.395415288" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.624943 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c5ncx" podStartSLOduration=120.624921937 podStartE2EDuration="2m0.624921937s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.576008779 +0000 UTC m=+146.397199289" watchObservedRunningTime="2025-10-03 07:50:45.624921937 +0000 UTC m=+146.446112427" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.626154 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-vsmt5" podStartSLOduration=120.626147399 podStartE2EDuration="2m0.626147399s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.624087549 +0000 UTC m=+146.445278059" watchObservedRunningTime="2025-10-03 
07:50:45.626147399 +0000 UTC m=+146.447337899" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.669213 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrt4t" podStartSLOduration=120.669194469 podStartE2EDuration="2m0.669194469s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.668831806 +0000 UTC m=+146.490022306" watchObservedRunningTime="2025-10-03 07:50:45.669194469 +0000 UTC m=+146.490384959" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.669454 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:45 crc kubenswrapper[4664]: E1003 07:50:45.669825 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:46.16981104 +0000 UTC m=+146.991001530 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.690947 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.702887 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-h7lkc" podStartSLOduration=120.70285709 podStartE2EDuration="2m0.70285709s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.701922218 +0000 UTC m=+146.523112728" watchObservedRunningTime="2025-10-03 07:50:45.70285709 +0000 UTC m=+146.524047580" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.745374 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" podStartSLOduration=120.745353911 podStartE2EDuration="2m0.745353911s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.744580175 +0000 UTC m=+146.565770685" watchObservedRunningTime="2025-10-03 07:50:45.745353911 +0000 UTC m=+146.566544421" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.774405 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:45 crc kubenswrapper[4664]: E1003 07:50:45.774871 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:46.274857391 +0000 UTC m=+147.096047881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.807787 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.808153 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.820586 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-jtp75" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.845856 4664 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-smvlm container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.14:8443/livez\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.845928 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm" podUID="2a7ecab1-cd44-4dee-810a-1fd601b96eb4" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.14:8443/livez\": dial tcp 10.217.0.14:8443: connect: connection refused" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.868185 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vcvhp" podStartSLOduration=120.868162175 podStartE2EDuration="2m0.868162175s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.865327209 +0000 UTC m=+146.686517729" watchObservedRunningTime="2025-10-03 07:50:45.868162175 +0000 UTC m=+146.689352665" Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.877243 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:45 crc kubenswrapper[4664]: E1003 07:50:45.877989 4664 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:46.377969798 +0000 UTC m=+147.199160288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:45 crc kubenswrapper[4664]: I1003 07:50:45.980457 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:45 crc kubenswrapper[4664]: E1003 07:50:45.980906 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:46.480885147 +0000 UTC m=+147.302075637 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.018694 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" podStartSLOduration=121.018658278 podStartE2EDuration="2m1.018658278s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:45.966315953 +0000 UTC m=+146.787506463" watchObservedRunningTime="2025-10-03 07:50:46.018658278 +0000 UTC m=+146.839848788" Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.085244 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:46 crc kubenswrapper[4664]: E1003 07:50:46.085639 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:46.585623739 +0000 UTC m=+147.406814229 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.187357 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:46 crc kubenswrapper[4664]: E1003 07:50:46.187700 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:46.6876852 +0000 UTC m=+147.508875690 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.193371 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 07:50:46 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld Oct 03 07:50:46 crc kubenswrapper[4664]: [+]process-running ok Oct 03 07:50:46 crc kubenswrapper[4664]: healthz check failed Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.193444 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.289395 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:46 crc kubenswrapper[4664]: E1003 07:50:46.289554 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:46.789528263 +0000 UTC m=+147.610718753 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.289801 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:46 crc kubenswrapper[4664]: E1003 07:50:46.290144 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:46.790132754 +0000 UTC m=+147.611323244 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.391309 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:46 crc kubenswrapper[4664]: E1003 07:50:46.391498 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:46.891472609 +0000 UTC m=+147.712663099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.391636 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:46 crc kubenswrapper[4664]: E1003 07:50:46.391954 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:46.891942745 +0000 UTC m=+147.713133285 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.468195 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-942tg" event={"ID":"5dba8f90-eac6-468f-91c7-b95d2d04b2ec","Type":"ContainerStarted","Data":"35b53f1c7c55d42f20004f1018faf536866f43f2582ddfb10b653746350f0d87"} Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.470008 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" event={"ID":"32eb3af2-4f0e-43f7-99ca-60eb556894d9","Type":"ContainerStarted","Data":"cb6f07492832216b1ca6f3c6c3aec7856f78156b46ea55b40aa594fe886fe332"} Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.472175 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-87c8s" event={"ID":"4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1","Type":"ContainerStarted","Data":"90c957d2960864ecd4d18e63180aceefa6d635d8786f09cced4e4e9204dc61b5"} Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.491988 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.492693 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:46 crc kubenswrapper[4664]: E1003 07:50:46.493216 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
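
Note the csi-hostpathplugin-v4cvf containers starting at 07:50:46 above: this is the hostpath CSI node plugin whose absence has been failing every mount and unmount of pvc-657094db. Once its registrar sidecar drops a registration socket under the kubelet's plugins_registry directory, the kubelet's plugin watcher picks the driver up and the retry loop can make progress. A small sketch for spotting that socket on the node, assuming the conventional /var/lib/kubelet layout (the socket file name is driver-specific, so the substring match is a guess):

    package main

    import (
    	"fmt"
    	"os"
    	"path/filepath"
    	"strings"
    )

    func main() {
    	const regDir = "/var/lib/kubelet/plugins_registry"
    	entries, err := os.ReadDir(regDir)
    	if err != nil {
    		panic(err)
    	}
    	for _, e := range entries {
    		// Node plugins drop a registration socket here; the kubelet's plugin
    		// watcher picks it up and adds the driver to its registered list.
    		if strings.Contains(e.Name(), "kubevirt.io.hostpath-provisioner") {
    			fmt.Println("registration socket present:", filepath.Join(regDir, e.Name()))
    		}
    	}
    }
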
No retries permitted until 2025-10-03 07:50:46.993201148 +0000 UTC m=+147.814391638 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.495492 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cbs" Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.580629 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-87c8s" podStartSLOduration=122.580591942 podStartE2EDuration="2m2.580591942s" podCreationTimestamp="2025-10-03 07:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:46.577757976 +0000 UTC m=+147.398948486" watchObservedRunningTime="2025-10-03 07:50:46.580591942 +0000 UTC m=+147.401782432" Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.597884 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:46 crc kubenswrapper[4664]: E1003 07:50:46.602830 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.102811525 +0000 UTC m=+147.924002025 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:46 crc kubenswrapper[4664]: E1003 07:50:46.699625 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.199591367 +0000 UTC m=+148.020781857 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.699901 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.700256 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:46 crc kubenswrapper[4664]: E1003 07:50:46.700624 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.200602091 +0000 UTC m=+148.021792591 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.802154 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:46 crc kubenswrapper[4664]: E1003 07:50:46.802989 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.302970182 +0000 UTC m=+148.124160672 (durationBeforeRetry 500ms). 
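
Each failed attempt in this loop is parked by nestedpendingoperations with "No retries permitted until ... (durationBeforeRetry 500ms)", so the kubelet re-polls the volume roughly twice a second rather than spinning. The delay stays pinned at 500ms throughout this log; upstream kubelet backoff can also grow the delay for an operation that keeps failing, so the fixed value here is best read as the first rung of that ladder. An illustrative loop (the initial delay matches the log; the doubling and the cap are assumptions modeled on upstream defaults, not values taken from this log):

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	delay := 500 * time.Millisecond // initial durationBeforeRetry, as logged
    	maxDelay := 2*time.Minute + 2*time.Second
    	for attempt := 1; attempt <= 6; attempt++ {
    		fmt.Printf("attempt %d failed; no retries permitted for %v\n", attempt, delay)
    		delay *= 2
    		if delay > maxDelay {
    			delay = maxDelay
    		}
    	}
    }
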
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:46 crc kubenswrapper[4664]: I1003 07:50:46.904792 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:46 crc kubenswrapper[4664]: E1003 07:50:46.905337 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.405321203 +0000 UTC m=+148.226511693 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.006854 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:47 crc kubenswrapper[4664]: E1003 07:50:47.007288 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.50726802 +0000 UTC m=+148.328458510 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.107922 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:47 crc kubenswrapper[4664]: E1003 07:50:47.108439 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.608422679 +0000 UTC m=+148.429613169 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.190384 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 07:50:47 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld Oct 03 07:50:47 crc kubenswrapper[4664]: [+]process-running ok Oct 03 07:50:47 crc kubenswrapper[4664]: healthz check failed Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.191016 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.209314 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:47 crc kubenswrapper[4664]: E1003 07:50:47.209527 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.709504277 +0000 UTC m=+148.530694767 (durationBeforeRetry 500ms). 
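
The probe failures threaded through this window (connection refused from olm-operator, downloads, marketplace-operator, and packageserver; HTTP 500 with named subchecks from the router's startup probe) are the expected shape of a cluster still converging: containers have started but their listeners are not serving yet, and the kubelet keeps probing on its period until they are, as the later "status=ready" transitions confirm. For reference, a probe matching the shape of the oauth-apiserver startup probe failing above (GET https://<pod-ip>:8443/livez) might be declared like this in Go types; the path, port, and scheme come from the log, while PeriodSeconds and FailureThreshold are assumed:

    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    	"k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
    	startup := corev1.Probe{
    		ProbeHandler: corev1.ProbeHandler{
    			HTTPGet: &corev1.HTTPGetAction{
    				Path:   "/livez",
    				Port:   intstr.FromInt(8443),
    				Scheme: corev1.URISchemeHTTPS,
    			},
    		},
    		PeriodSeconds:    5,  // assumed, not read from the cluster
    		FailureThreshold: 30, // assumed, not read from the cluster
    	}
    	fmt.Printf("%+v\n", startup)
    }
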
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.209579 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:47 crc kubenswrapper[4664]: E1003 07:50:47.209986 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.709974743 +0000 UTC m=+148.531165233 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.311024 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:47 crc kubenswrapper[4664]: E1003 07:50:47.311371 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.81134808 +0000 UTC m=+148.632538570 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.311794 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:47 crc kubenswrapper[4664]: E1003 07:50:47.312164 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.812154908 +0000 UTC m=+148.633345398 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.412994 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:47 crc kubenswrapper[4664]: E1003 07:50:47.413374 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:47.913356639 +0000 UTC m=+148.734547129 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.472840 4664 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-xt6kz container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.472903 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" podUID="77a9c147-cee1-4a60-a2e1-8dd93096f35f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.497911 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" event={"ID":"32eb3af2-4f0e-43f7-99ca-60eb556894d9","Type":"ContainerStarted","Data":"9c2dbf72aecacec5c2befb0b91ae50a60a71cc585a09be42ff6eaef8609f1ae5"} Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.505981 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n4cpb"] Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.506998 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n4cpb" Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.517098 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.518502 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:47 crc kubenswrapper[4664]: E1003 07:50:47.518809 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:48.018799835 +0000 UTC m=+148.839990325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.599732 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n4cpb"]
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.623550 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:47 crc kubenswrapper[4664]: E1003 07:50:47.623704 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:48.123682291 +0000 UTC m=+148.944872781 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.623996 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.624182 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea079e38-0970-4e57-af62-4910892ea04d-catalog-content\") pod \"community-operators-n4cpb\" (UID: \"ea079e38-0970-4e57-af62-4910892ea04d\") " pod="openshift-marketplace/community-operators-n4cpb"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.624259 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcm94\" (UniqueName: \"kubernetes.io/projected/ea079e38-0970-4e57-af62-4910892ea04d-kube-api-access-jcm94\") pod \"community-operators-n4cpb\" (UID: \"ea079e38-0970-4e57-af62-4910892ea04d\") " pod="openshift-marketplace/community-operators-n4cpb"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.624287 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea079e38-0970-4e57-af62-4910892ea04d-utilities\") pod \"community-operators-n4cpb\" (UID: \"ea079e38-0970-4e57-af62-4910892ea04d\") " pod="openshift-marketplace/community-operators-n4cpb"
Oct 03 07:50:47 crc kubenswrapper[4664]: E1003 07:50:47.627291 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:48.127278883 +0000 UTC m=+148.948469373 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.697146 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2t2bc"]
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.698148 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2t2bc"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.699686 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.714205 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2t2bc"]
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.725291 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.725469 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea079e38-0970-4e57-af62-4910892ea04d-catalog-content\") pod \"community-operators-n4cpb\" (UID: \"ea079e38-0970-4e57-af62-4910892ea04d\") " pod="openshift-marketplace/community-operators-n4cpb"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.725494 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.725512 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.725531 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcm94\" (UniqueName: \"kubernetes.io/projected/ea079e38-0970-4e57-af62-4910892ea04d-kube-api-access-jcm94\") pod \"community-operators-n4cpb\" (UID: \"ea079e38-0970-4e57-af62-4910892ea04d\") " pod="openshift-marketplace/community-operators-n4cpb"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.725554 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.725569 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea079e38-0970-4e57-af62-4910892ea04d-utilities\") pod \"community-operators-n4cpb\" (UID: \"ea079e38-0970-4e57-af62-4910892ea04d\") " pod="openshift-marketplace/community-operators-n4cpb"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.725600 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.727888 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea079e38-0970-4e57-af62-4910892ea04d-catalog-content\") pod \"community-operators-n4cpb\" (UID: \"ea079e38-0970-4e57-af62-4910892ea04d\") " pod="openshift-marketplace/community-operators-n4cpb"
Oct 03 07:50:47 crc kubenswrapper[4664]: E1003 07:50:47.728007 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:48.227984848 +0000 UTC m=+149.049175388 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.729237 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea079e38-0970-4e57-af62-4910892ea04d-utilities\") pod \"community-operators-n4cpb\" (UID: \"ea079e38-0970-4e57-af62-4910892ea04d\") " pod="openshift-marketplace/community-operators-n4cpb"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.731008 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.737263 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.737322 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.738845 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.783749 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcm94\" (UniqueName: \"kubernetes.io/projected/ea079e38-0970-4e57-af62-4910892ea04d-kube-api-access-jcm94\") pod \"community-operators-n4cpb\" (UID: \"ea079e38-0970-4e57-af62-4910892ea04d\") " pod="openshift-marketplace/community-operators-n4cpb"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.826862 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4m48\" (UniqueName: \"kubernetes.io/projected/5c238baa-b35f-404b-b6e3-ebec940e30be-kube-api-access-b4m48\") pod \"certified-operators-2t2bc\" (UID: \"5c238baa-b35f-404b-b6e3-ebec940e30be\") " pod="openshift-marketplace/certified-operators-2t2bc"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.826947 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.826989 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c238baa-b35f-404b-b6e3-ebec940e30be-utilities\") pod \"certified-operators-2t2bc\" (UID: \"5c238baa-b35f-404b-b6e3-ebec940e30be\") " pod="openshift-marketplace/certified-operators-2t2bc"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.827034 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c238baa-b35f-404b-b6e3-ebec940e30be-catalog-content\") pod \"certified-operators-2t2bc\" (UID: \"5c238baa-b35f-404b-b6e3-ebec940e30be\") " pod="openshift-marketplace/certified-operators-2t2bc"
Oct 03 07:50:47 crc kubenswrapper[4664]: E1003 07:50:47.827357 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:48.327337627 +0000 UTC m=+149.148528167 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.835567 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n4cpb"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.899186 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.902901 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.911385 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.913247 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-swwsk"]
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.914987 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swwsk"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.928228 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.928671 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4m48\" (UniqueName: \"kubernetes.io/projected/5c238baa-b35f-404b-b6e3-ebec940e30be-kube-api-access-b4m48\") pod \"certified-operators-2t2bc\" (UID: \"5c238baa-b35f-404b-b6e3-ebec940e30be\") " pod="openshift-marketplace/certified-operators-2t2bc"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.928746 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c238baa-b35f-404b-b6e3-ebec940e30be-utilities\") pod \"certified-operators-2t2bc\" (UID: \"5c238baa-b35f-404b-b6e3-ebec940e30be\") " pod="openshift-marketplace/certified-operators-2t2bc"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.928785 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c238baa-b35f-404b-b6e3-ebec940e30be-catalog-content\") pod \"certified-operators-2t2bc\" (UID: \"5c238baa-b35f-404b-b6e3-ebec940e30be\") " pod="openshift-marketplace/certified-operators-2t2bc"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.929183 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c238baa-b35f-404b-b6e3-ebec940e30be-catalog-content\") pod \"certified-operators-2t2bc\" (UID: \"5c238baa-b35f-404b-b6e3-ebec940e30be\") " pod="openshift-marketplace/certified-operators-2t2bc"
Oct 03 07:50:47 crc kubenswrapper[4664]: E1003 07:50:47.929264 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:48.429248232 +0000 UTC m=+149.250438722 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.929845 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c238baa-b35f-404b-b6e3-ebec940e30be-utilities\") pod \"certified-operators-2t2bc\" (UID: \"5c238baa-b35f-404b-b6e3-ebec940e30be\") " pod="openshift-marketplace/certified-operators-2t2bc"
Oct 03 07:50:47 crc kubenswrapper[4664]: I1003 07:50:47.938945 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-swwsk"]
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.017033 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4m48\" (UniqueName: \"kubernetes.io/projected/5c238baa-b35f-404b-b6e3-ebec940e30be-kube-api-access-b4m48\") pod \"certified-operators-2t2bc\" (UID: \"5c238baa-b35f-404b-b6e3-ebec940e30be\") " pod="openshift-marketplace/certified-operators-2t2bc"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.031940 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2t2bc"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.035029 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-catalog-content\") pod \"community-operators-swwsk\" (UID: \"78dc4ca9-6ff9-428f-b341-a02f0a85dfec\") " pod="openshift-marketplace/community-operators-swwsk"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.035356 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9f5t\" (UniqueName: \"kubernetes.io/projected/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-kube-api-access-p9f5t\") pod \"community-operators-swwsk\" (UID: \"78dc4ca9-6ff9-428f-b341-a02f0a85dfec\") " pod="openshift-marketplace/community-operators-swwsk"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.035384 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-utilities\") pod \"community-operators-swwsk\" (UID: \"78dc4ca9-6ff9-428f-b341-a02f0a85dfec\") " pod="openshift-marketplace/community-operators-swwsk"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.035423 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:48 crc kubenswrapper[4664]: E1003 07:50:48.035757 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:48.535745733 +0000 UTC m=+149.356936223 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.135962 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.136344 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-catalog-content\") pod \"community-operators-swwsk\" (UID: \"78dc4ca9-6ff9-428f-b341-a02f0a85dfec\") " pod="openshift-marketplace/community-operators-swwsk"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.136447 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9f5t\" (UniqueName: \"kubernetes.io/projected/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-kube-api-access-p9f5t\") pod \"community-operators-swwsk\" (UID: \"78dc4ca9-6ff9-428f-b341-a02f0a85dfec\") " pod="openshift-marketplace/community-operators-swwsk"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.136494 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-utilities\") pod \"community-operators-swwsk\" (UID: \"78dc4ca9-6ff9-428f-b341-a02f0a85dfec\") " pod="openshift-marketplace/community-operators-swwsk"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.136936 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-utilities\") pod \"community-operators-swwsk\" (UID: \"78dc4ca9-6ff9-428f-b341-a02f0a85dfec\") " pod="openshift-marketplace/community-operators-swwsk"
Oct 03 07:50:48 crc kubenswrapper[4664]: E1003 07:50:48.137012 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:48.636997327 +0000 UTC m=+149.458187817 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.137224 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-catalog-content\") pod \"community-operators-swwsk\" (UID: \"78dc4ca9-6ff9-428f-b341-a02f0a85dfec\") " pod="openshift-marketplace/community-operators-swwsk"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.147005 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ftqrw"]
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.148247 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftqrw"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.188393 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9f5t\" (UniqueName: \"kubernetes.io/projected/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-kube-api-access-p9f5t\") pod \"community-operators-swwsk\" (UID: \"78dc4ca9-6ff9-428f-b341-a02f0a85dfec\") " pod="openshift-marketplace/community-operators-swwsk"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.220202 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 07:50:48 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld
Oct 03 07:50:48 crc kubenswrapper[4664]: [+]process-running ok
Oct 03 07:50:48 crc kubenswrapper[4664]: healthz check failed
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.220258 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.241481 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.241554 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-catalog-content\") pod \"certified-operators-ftqrw\" (UID: \"5d56b337-1eb1-4e79-b6ef-bb2d85737a41\") " pod="openshift-marketplace/certified-operators-ftqrw"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.241581 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-utilities\") pod \"certified-operators-ftqrw\" (UID: \"5d56b337-1eb1-4e79-b6ef-bb2d85737a41\") " pod="openshift-marketplace/certified-operators-ftqrw"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.241649 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkgg7\" (UniqueName: \"kubernetes.io/projected/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-kube-api-access-vkgg7\") pod \"certified-operators-ftqrw\" (UID: \"5d56b337-1eb1-4e79-b6ef-bb2d85737a41\") " pod="openshift-marketplace/certified-operators-ftqrw"
Oct 03 07:50:48 crc kubenswrapper[4664]: E1003 07:50:48.242007 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:48.741991767 +0000 UTC m=+149.563182257 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.251100 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swwsk"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.345192 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.345493 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-catalog-content\") pod \"certified-operators-ftqrw\" (UID: \"5d56b337-1eb1-4e79-b6ef-bb2d85737a41\") " pod="openshift-marketplace/certified-operators-ftqrw"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.345519 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-utilities\") pod \"certified-operators-ftqrw\" (UID: \"5d56b337-1eb1-4e79-b6ef-bb2d85737a41\") " pod="openshift-marketplace/certified-operators-ftqrw"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.345566 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkgg7\" (UniqueName: \"kubernetes.io/projected/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-kube-api-access-vkgg7\") pod \"certified-operators-ftqrw\" (UID: \"5d56b337-1eb1-4e79-b6ef-bb2d85737a41\") " pod="openshift-marketplace/certified-operators-ftqrw"
Oct 03 07:50:48 crc kubenswrapper[4664]: E1003 07:50:48.345945 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:48.845930091 +0000 UTC m=+149.667120571 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.346680 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-catalog-content\") pod \"certified-operators-ftqrw\" (UID: \"5d56b337-1eb1-4e79-b6ef-bb2d85737a41\") " pod="openshift-marketplace/certified-operators-ftqrw"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.346907 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-utilities\") pod \"certified-operators-ftqrw\" (UID: \"5d56b337-1eb1-4e79-b6ef-bb2d85737a41\") " pod="openshift-marketplace/certified-operators-ftqrw"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.408724 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftqrw"]
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.420757 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkgg7\" (UniqueName: \"kubernetes.io/projected/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-kube-api-access-vkgg7\") pod \"certified-operators-ftqrw\" (UID: \"5d56b337-1eb1-4e79-b6ef-bb2d85737a41\") " pod="openshift-marketplace/certified-operators-ftqrw"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.446332 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:48 crc kubenswrapper[4664]: E1003 07:50:48.446696 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:48.946683868 +0000 UTC m=+149.767874348 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.476261 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftqrw"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.512715 4664 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-xt6kz container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.512762 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz" podUID="77a9c147-cee1-4a60-a2e1-8dd93096f35f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.554124 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:48 crc kubenswrapper[4664]: E1003 07:50:48.554504 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:49.054486653 +0000 UTC m=+149.875677143 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.600699 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" event={"ID":"32eb3af2-4f0e-43f7-99ca-60eb556894d9","Type":"ContainerStarted","Data":"4e742843ece4761699c28f248d1972a00ebdcbba929af773bf2f75990f30ffe6"}
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.656679 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:48 crc kubenswrapper[4664]: E1003 07:50:48.656959 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:49.156947917 +0000 UTC m=+149.978138407 (durationBeforeRetry 500ms).
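Recovery is already visible in this stretch: the PLEG reports ContainerStarted for hostpath-provisioner/csi-hostpathplugin-v4cvf at 07:50:47.497911 and again at 07:50:48.600699, so the driver pod's containers are coming up while the volume operations keep cycling. To quantify the loop, the backoff lines can be tallied straight from the journal; a small sketch (Python stdlib only, with the regex written against the exact message text above; entries whose Error tail wrapped onto a following line are skipped by this simple per-line matcher):

```python
import re
import sys
from collections import Counter

# Kubelet backoff lines as seen above: retry deadline, then the failing
# volume operation and the volume name.
RETRY = re.compile(
    r'No retries permitted until (?P<deadline>\S+ \S+ \+0000 UTC)'
    r'.*?Error: (?P<op>\w+\.\w+) failed for volume "(?P<vol>[^"]+)"'
)

counts = Counter()
for line in sys.stdin:
    if "not found in the list of registered CSI drivers" not in line:
        continue
    m = RETRY.search(line)
    if m:
        counts[m.group("op"), m.group("vol")] += 1
        print(m.group("deadline"), m.group("op"), m.group("vol"))

for (op, vol), n in counts.most_common():
    print(f"{n:3d}x {op} {vol}")
```

Fed with `journalctl -u kubelet` output, this would show MountVolume.MountDevice and UnmountVolume.TearDown alternating against the same PVC, each retry pushed out by a fresh 500 ms deadline.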
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.757444 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:48 crc kubenswrapper[4664]: E1003 07:50:48.758965 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:49.258944436 +0000 UTC m=+150.080134926 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.859142 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:48 crc kubenswrapper[4664]: E1003 07:50:48.859504 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:49.359486615 +0000 UTC m=+150.180677105 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.966369 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:48 crc kubenswrapper[4664]: I1003 07:50:48.966747 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2t2bc"]
Oct 03 07:50:48 crc kubenswrapper[4664]: E1003 07:50:48.966773 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:49.466753583 +0000 UTC m=+150.287944073 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:49 crc kubenswrapper[4664]: W1003 07:50:49.025647 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-1ad5a8b3ce2b0140c9b9d8bca0a8bcb41e49c01a52c993a03b104d358a32a0a1 WatchSource:0}: Error finding container 1ad5a8b3ce2b0140c9b9d8bca0a8bcb41e49c01a52c993a03b104d358a32a0a1: Status 404 returned error can't find the container with id 1ad5a8b3ce2b0140c9b9d8bca0a8bcb41e49c01a52c993a03b104d358a32a0a1
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.071088 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:49 crc kubenswrapper[4664]: E1003 07:50:49.071585 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:49.571562037 +0000 UTC m=+150.392752527 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.149656 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n4cpb"]
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.172210 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:49 crc kubenswrapper[4664]: E1003 07:50:49.172308 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:49.672293232 +0000 UTC m=+150.493483722 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.172595 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:49 crc kubenswrapper[4664]: E1003 07:50:49.172860 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:49.672849771 +0000 UTC m=+150.494040261 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.189859 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 07:50:49 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld
Oct 03 07:50:49 crc kubenswrapper[4664]: [+]process-running ok
Oct 03 07:50:49 crc kubenswrapper[4664]: healthz check failed
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.189937 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.275201 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:49 crc kubenswrapper[4664]: E1003 07:50:49.275569 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:49.775551124 +0000 UTC m=+150.596741624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.377560 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:49 crc kubenswrapper[4664]: E1003 07:50:49.378215 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:49.878203245 +0000 UTC m=+150.699393735 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.394767 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftqrw"]
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.409544 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.410355 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.415076 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.415272 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.430091 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 03 07:50:49 crc kubenswrapper[4664]: W1003 07:50:49.431315 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d56b337_1eb1_4e79_b6ef_bb2d85737a41.slice/crio-eea1b152f46022f0a187f932fe7a593b43617ea73d8e070b731e61a1396d6278 WatchSource:0}: Error finding container eea1b152f46022f0a187f932fe7a593b43617ea73d8e070b731e61a1396d6278: Status 404 returned error can't find the container with id eea1b152f46022f0a187f932fe7a593b43617ea73d8e070b731e61a1396d6278
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.465368 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-swwsk"]
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.481908 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:49 crc kubenswrapper[4664]: E1003 07:50:49.482470 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:49.982452539 +0000 UTC m=+150.803643029 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.500538 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gtxvm"]
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.502174 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gtxvm"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.510912 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.529133 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gtxvm"]
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.585339 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583c3690-dbf8-4272-bb35-b5557b7a3e74-utilities\") pod \"redhat-marketplace-gtxvm\" (UID: \"583c3690-dbf8-4272-bb35-b5557b7a3e74\") " pod="openshift-marketplace/redhat-marketplace-gtxvm"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.585442 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh7gc\" (UniqueName: \"kubernetes.io/projected/583c3690-dbf8-4272-bb35-b5557b7a3e74-kube-api-access-zh7gc\") pod \"redhat-marketplace-gtxvm\" (UID: \"583c3690-dbf8-4272-bb35-b5557b7a3e74\") " pod="openshift-marketplace/redhat-marketplace-gtxvm"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.585492 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63d05775-5de6-4f48-bfa5-96df1a0a8aa3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"63d05775-5de6-4f48-bfa5-96df1a0a8aa3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.585517 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583c3690-dbf8-4272-bb35-b5557b7a3e74-catalog-content\") pod \"redhat-marketplace-gtxvm\" (UID: \"583c3690-dbf8-4272-bb35-b5557b7a3e74\") " pod="openshift-marketplace/redhat-marketplace-gtxvm"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.585591 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.585698 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63d05775-5de6-4f48-bfa5-96df1a0a8aa3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"63d05775-5de6-4f48-bfa5-96df1a0a8aa3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 03 07:50:49 crc kubenswrapper[4664]: E1003 07:50:49.586309 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:50.086235419 +0000 UTC m=+150.907425909 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.636741 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2t2bc" event={"ID":"5c238baa-b35f-404b-b6e3-ebec940e30be","Type":"ContainerStarted","Data":"407c6a0a92fd9e8c562d093d20e0749be92f57771ea2404294790fc4894a7adf"}
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.641425 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ae6f52012e276e9fa8a048cc4ac052be053fd5ca22aab70811df2bb01428c337"}
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.648404 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftqrw" event={"ID":"5d56b337-1eb1-4e79-b6ef-bb2d85737a41","Type":"ContainerStarted","Data":"eea1b152f46022f0a187f932fe7a593b43617ea73d8e070b731e61a1396d6278"}
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.650936 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4cpb" event={"ID":"ea079e38-0970-4e57-af62-4910892ea04d","Type":"ContainerStarted","Data":"8e4fbdc4b3415f1af4bca9861846e19cf703a9a4390a6f44b8ea5e624c26e12b"}
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.652345 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swwsk" event={"ID":"78dc4ca9-6ff9-428f-b341-a02f0a85dfec","Type":"ContainerStarted","Data":"e05f47359152ac2b3f8ae24e42c3e38dc6903ef306bd4df16b375778b197dab2"}
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.653410 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f418545cb31c7c7a40b43bccc7f8e43a85f7dcca276459504602275424bb4756"}
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.672757 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1ad5a8b3ce2b0140c9b9d8bca0a8bcb41e49c01a52c993a03b104d358a32a0a1"}
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.689128 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.689327 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583c3690-dbf8-4272-bb35-b5557b7a3e74-utilities\") pod \"redhat-marketplace-gtxvm\" (UID: \"583c3690-dbf8-4272-bb35-b5557b7a3e74\") " pod="openshift-marketplace/redhat-marketplace-gtxvm"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.689380 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh7gc\" (UniqueName: \"kubernetes.io/projected/583c3690-dbf8-4272-bb35-b5557b7a3e74-kube-api-access-zh7gc\") pod \"redhat-marketplace-gtxvm\" (UID: \"583c3690-dbf8-4272-bb35-b5557b7a3e74\") " pod="openshift-marketplace/redhat-marketplace-gtxvm"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.689399 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63d05775-5de6-4f48-bfa5-96df1a0a8aa3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"63d05775-5de6-4f48-bfa5-96df1a0a8aa3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.689417 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583c3690-dbf8-4272-bb35-b5557b7a3e74-catalog-content\") pod \"redhat-marketplace-gtxvm\" (UID: \"583c3690-dbf8-4272-bb35-b5557b7a3e74\") " pod="openshift-marketplace/redhat-marketplace-gtxvm"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.689460 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63d05775-5de6-4f48-bfa5-96df1a0a8aa3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"63d05775-5de6-4f48-bfa5-96df1a0a8aa3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.689530 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63d05775-5de6-4f48-bfa5-96df1a0a8aa3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"63d05775-5de6-4f48-bfa5-96df1a0a8aa3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 03 07:50:49 crc kubenswrapper[4664]: E1003 07:50:49.689625 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:50.189582533 +0000 UTC m=+151.010773023 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.689989 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583c3690-dbf8-4272-bb35-b5557b7a3e74-utilities\") pod \"redhat-marketplace-gtxvm\" (UID: \"583c3690-dbf8-4272-bb35-b5557b7a3e74\") " pod="openshift-marketplace/redhat-marketplace-gtxvm" Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.690535 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583c3690-dbf8-4272-bb35-b5557b7a3e74-catalog-content\") pod \"redhat-marketplace-gtxvm\" (UID: \"583c3690-dbf8-4272-bb35-b5557b7a3e74\") " pod="openshift-marketplace/redhat-marketplace-gtxvm" Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.718516 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63d05775-5de6-4f48-bfa5-96df1a0a8aa3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"63d05775-5de6-4f48-bfa5-96df1a0a8aa3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.718551 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh7gc\" (UniqueName: \"kubernetes.io/projected/583c3690-dbf8-4272-bb35-b5557b7a3e74-kube-api-access-zh7gc\") pod \"redhat-marketplace-gtxvm\" (UID: \"583c3690-dbf8-4272-bb35-b5557b7a3e74\") " pod="openshift-marketplace/redhat-marketplace-gtxvm" Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.749083 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.790668 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:49 crc kubenswrapper[4664]: E1003 07:50:49.791001 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:50.290988521 +0000 UTC m=+151.112179011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.892293 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:49 crc kubenswrapper[4664]: E1003 07:50:49.892857 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 07:50:50.392837695 +0000 UTC m=+151.214028185 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.893534 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dhpld"] Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.895646 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhpld"] Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.895790 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.910114 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gtxvm" Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.925270 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tdjrr" Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.932636 4664 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.994892 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18497250-f896-4630-98c4-c2915a50fd1b-catalog-content\") pod \"redhat-marketplace-dhpld\" (UID: \"18497250-f896-4630-98c4-c2915a50fd1b\") " pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.995032 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18497250-f896-4630-98c4-c2915a50fd1b-utilities\") pod \"redhat-marketplace-dhpld\" (UID: \"18497250-f896-4630-98c4-c2915a50fd1b\") " pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.995061 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjhfn\" (UniqueName: \"kubernetes.io/projected/18497250-f896-4630-98c4-c2915a50fd1b-kube-api-access-tjhfn\") pod \"redhat-marketplace-dhpld\" (UID: \"18497250-f896-4630-98c4-c2915a50fd1b\") " pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:50:49 crc kubenswrapper[4664]: I1003 07:50:49.995105 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:49 crc kubenswrapper[4664]: E1003 07:50:49.995387 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 07:50:50.495375071 +0000 UTC m=+151.316565561 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-szr58" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.053716 4664 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-03T07:50:49.932687417Z","Handler":null,"Name":""} Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.083839 4664 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.083874 4664 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.095825 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.096064 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjhfn\" (UniqueName: \"kubernetes.io/projected/18497250-f896-4630-98c4-c2915a50fd1b-kube-api-access-tjhfn\") pod \"redhat-marketplace-dhpld\" (UID: \"18497250-f896-4630-98c4-c2915a50fd1b\") " pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.096136 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18497250-f896-4630-98c4-c2915a50fd1b-catalog-content\") pod \"redhat-marketplace-dhpld\" (UID: \"18497250-f896-4630-98c4-c2915a50fd1b\") " pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.096277 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18497250-f896-4630-98c4-c2915a50fd1b-utilities\") pod \"redhat-marketplace-dhpld\" (UID: \"18497250-f896-4630-98c4-c2915a50fd1b\") " pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.096703 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18497250-f896-4630-98c4-c2915a50fd1b-utilities\") pod \"redhat-marketplace-dhpld\" (UID: \"18497250-f896-4630-98c4-c2915a50fd1b\") " pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.096862 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18497250-f896-4630-98c4-c2915a50fd1b-catalog-content\") pod \"redhat-marketplace-dhpld\" (UID: \"18497250-f896-4630-98c4-c2915a50fd1b\") " 
pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.101279 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.167749 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjhfn\" (UniqueName: \"kubernetes.io/projected/18497250-f896-4630-98c4-c2915a50fd1b-kube-api-access-tjhfn\") pod \"redhat-marketplace-dhpld\" (UID: \"18497250-f896-4630-98c4-c2915a50fd1b\") " pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.189395 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.197121 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.197219 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.199219 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 07:50:50 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld Oct 03 07:50:50 crc kubenswrapper[4664]: [+]process-running ok Oct 03 07:50:50 crc kubenswrapper[4664]: healthz check failed Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.199265 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.199406 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.199575 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.205149 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.222293 4664 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.222342 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.230555 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhpld"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.250903 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.277402 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gtxvm"]
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.298374 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d52b7963-c3dd-44d1-b33f-14b98a2ef0f2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d52b7963-c3dd-44d1-b33f-14b98a2ef0f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.298448 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d52b7963-c3dd-44d1-b33f-14b98a2ef0f2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d52b7963-c3dd-44d1-b33f-14b98a2ef0f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.399546 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d52b7963-c3dd-44d1-b33f-14b98a2ef0f2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d52b7963-c3dd-44d1-b33f-14b98a2ef0f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.399922 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d52b7963-c3dd-44d1-b33f-14b98a2ef0f2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d52b7963-c3dd-44d1-b33f-14b98a2ef0f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.400078 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d52b7963-c3dd-44d1-b33f-14b98a2ef0f2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d52b7963-c3dd-44d1-b33f-14b98a2ef0f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.416034 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-szr58\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") " pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.430568 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d52b7963-c3dd-44d1-b33f-14b98a2ef0f2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d52b7963-c3dd-44d1-b33f-14b98a2ef0f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.464360 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhpld"]
Oct 03 07:50:50 crc kubenswrapper[4664]: W1003 07:50:50.485222 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18497250_f896_4630_98c4_c2915a50fd1b.slice/crio-e867e459054869347c7fed84cc7b077fc1fd8ca274e73a9c345925b2c7ba982c WatchSource:0}: Error finding container e867e459054869347c7fed84cc7b077fc1fd8ca274e73a9c345925b2c7ba982c: Status 404 returned error can't find the container with id e867e459054869347c7fed84cc7b077fc1fd8ca274e73a9c345925b2c7ba982c
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.533275 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.629386 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.703379 4664 generic.go:334] "Generic (PLEG): container finished" podID="5c238baa-b35f-404b-b6e3-ebec940e30be" containerID="fc5f5f61d2852dba7006aea91cc5d03bbf7b7df35f5b4aba10cbccb8ec21bcf4" exitCode=0
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.703495 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2t2bc" event={"ID":"5c238baa-b35f-404b-b6e3-ebec940e30be","Type":"ContainerDied","Data":"fc5f5f61d2852dba7006aea91cc5d03bbf7b7df35f5b4aba10cbccb8ec21bcf4"}
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.705445 4664 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.711839 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"046945b989d2a3481f9cb97ea4b5c52333bee649aaf95e67beeed5409a3d5e7f"}
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.725223 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-2m6x7"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.725256 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-2m6x7"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.729480 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhpld" event={"ID":"18497250-f896-4630-98c4-c2915a50fd1b","Type":"ContainerStarted","Data":"e867e459054869347c7fed84cc7b077fc1fd8ca274e73a9c345925b2c7ba982c"}
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.732842 4664 patch_prober.go:28] interesting pod/console-f9d7485db-2m6x7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.733097 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-2m6x7" podUID="ceef7ba8-f996-4b56-a477-23873e39cde7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.741302 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-87c8s"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.741341 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-87c8s"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.748155 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"70a031c451ae6dfff6ef4d9f80a0ece1dafc003c23ea2083dcd72d353becb50b"}
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.756392 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"23e17f2ac9a4be919daf090560a84e6ed92b962407a7b135c12692a66eb0cb91"}
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.756760 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.763143 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" event={"ID":"32eb3af2-4f0e-43f7-99ca-60eb556894d9","Type":"ContainerStarted","Data":"4460af6ce84c7191a9c1529dba10d0bb2fb68b91bf114f6c53df53d0f23af019"}
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.765713 4664 patch_prober.go:28] interesting pod/apiserver-76f77b778f-87c8s container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Oct 03 07:50:50 crc kubenswrapper[4664]: [+]log ok
Oct 03 07:50:50 crc kubenswrapper[4664]: [+]etcd ok
Oct 03 07:50:50 crc kubenswrapper[4664]: [+]poststarthook/start-apiserver-admission-initializer ok
Oct 03 07:50:50 crc kubenswrapper[4664]: [+]poststarthook/generic-apiserver-start-informers ok
Oct 03 07:50:50 crc kubenswrapper[4664]: [+]poststarthook/max-in-flight-filter ok
Oct 03 07:50:50 crc kubenswrapper[4664]: [+]poststarthook/storage-object-count-tracker-hook ok
Oct 03 07:50:50 crc kubenswrapper[4664]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Oct 03 07:50:50 crc kubenswrapper[4664]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Oct 03 07:50:50 crc kubenswrapper[4664]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Oct 03 07:50:50 crc kubenswrapper[4664]: [+]poststarthook/project.openshift.io-projectcache ok
Oct 03 07:50:50 crc kubenswrapper[4664]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Oct 03 07:50:50 crc kubenswrapper[4664]: [+]poststarthook/openshift.io-startinformers ok
Oct 03 07:50:50 crc kubenswrapper[4664]: [+]poststarthook/openshift.io-restmapperupdater ok
Oct 03 07:50:50 crc kubenswrapper[4664]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Oct 03 07:50:50 crc kubenswrapper[4664]: livez check failed
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.765777 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-87c8s" podUID="4bb81c70-eeda-42ca-ae45-b0b2e9d0b7b1" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.773817 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtxvm" event={"ID":"583c3690-dbf8-4272-bb35-b5557b7a3e74","Type":"ContainerStarted","Data":"a33fd0bebbb47d9df35316a4e49fb9bd9236cd12954709b0208c6a7e384293e1"}
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.797287 4664 generic.go:334] "Generic (PLEG): container finished" podID="ea079e38-0970-4e57-af62-4910892ea04d" containerID="e49f502612bef94c12e697bc00f79db50f7b907f80534b51ad099ce75963af10" exitCode=0
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.797377 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4cpb" event={"ID":"ea079e38-0970-4e57-af62-4910892ea04d","Type":"ContainerDied","Data":"e49f502612bef94c12e697bc00f79db50f7b907f80534b51ad099ce75963af10"}
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.800440 4664 generic.go:334] "Generic (PLEG): container finished" podID="5d56b337-1eb1-4e79-b6ef-bb2d85737a41" containerID="132246323cb672d259c14cf20694890722e4463a3702ec734ecdad2583077fc3" exitCode=0
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.800507 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftqrw" event={"ID":"5d56b337-1eb1-4e79-b6ef-bb2d85737a41","Type":"ContainerDied","Data":"132246323cb672d259c14cf20694890722e4463a3702ec734ecdad2583077fc3"}
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.807486 4664 generic.go:334] "Generic (PLEG): container finished" podID="78dc4ca9-6ff9-428f-b341-a02f0a85dfec" containerID="566d012fc3b7aed06eb82996c38f74d151e061fa2074bb49c04a6816f6a035e3" exitCode=0
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.807688 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swwsk" event={"ID":"78dc4ca9-6ff9-428f-b341-a02f0a85dfec","Type":"ContainerDied","Data":"566d012fc3b7aed06eb82996c38f74d151e061fa2074bb49c04a6816f6a035e3"}
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.813824 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"63d05775-5de6-4f48-bfa5-96df1a0a8aa3","Type":"ContainerStarted","Data":"af0f3c88e41d2ee87e5720f5802ba4a2509049168f2c93bfe41df10648108805"}
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.833184 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.834516 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-v4cvf" podStartSLOduration=12.834497934 podStartE2EDuration="12.834497934s" podCreationTimestamp="2025-10-03 07:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:50.814110102 +0000 UTC m=+151.635300602" watchObservedRunningTime="2025-10-03 07:50:50.834497934 +0000 UTC m=+151.655688424"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.838684 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-smvlm"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.894929 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rvmkh"]
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.900021 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rvmkh"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.920003 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rvmkh"]
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.943454 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.945697 4664 patch_prober.go:28] interesting pod/downloads-7954f5f757-zcngp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.945764 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zcngp" podUID="acf5523b-3f1d-495e-8014-0313925e8727" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.946036 4664 patch_prober.go:28] interesting pod/downloads-7954f5f757-zcngp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Oct 03 07:50:50 crc kubenswrapper[4664]: I1003 07:50:50.946066 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zcngp" podUID="acf5523b-3f1d-495e-8014-0313925e8727" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.031481 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b855f400-75f9-44f7-9da3-1b4a850ac090-catalog-content\") pod \"redhat-operators-rvmkh\" (UID: \"b855f400-75f9-44f7-9da3-1b4a850ac090\") " pod="openshift-marketplace/redhat-operators-rvmkh"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.031871 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnbfr\" (UniqueName: \"kubernetes.io/projected/b855f400-75f9-44f7-9da3-1b4a850ac090-kube-api-access-qnbfr\") pod \"redhat-operators-rvmkh\" (UID: \"b855f400-75f9-44f7-9da3-1b4a850ac090\") " pod="openshift-marketplace/redhat-operators-rvmkh"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.032065 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b855f400-75f9-44f7-9da3-1b4a850ac090-utilities\") pod \"redhat-operators-rvmkh\" (UID: \"b855f400-75f9-44f7-9da3-1b4a850ac090\") " pod="openshift-marketplace/redhat-operators-rvmkh"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.056758 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.072215 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-szr58"]
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.133757 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b855f400-75f9-44f7-9da3-1b4a850ac090-utilities\") pod \"redhat-operators-rvmkh\" (UID: \"b855f400-75f9-44f7-9da3-1b4a850ac090\") " pod="openshift-marketplace/redhat-operators-rvmkh"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.134126 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b855f400-75f9-44f7-9da3-1b4a850ac090-catalog-content\") pod \"redhat-operators-rvmkh\" (UID: \"b855f400-75f9-44f7-9da3-1b4a850ac090\") " pod="openshift-marketplace/redhat-operators-rvmkh"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.134163 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnbfr\" (UniqueName: \"kubernetes.io/projected/b855f400-75f9-44f7-9da3-1b4a850ac090-kube-api-access-qnbfr\") pod \"redhat-operators-rvmkh\" (UID: \"b855f400-75f9-44f7-9da3-1b4a850ac090\") " pod="openshift-marketplace/redhat-operators-rvmkh"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.134862 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b855f400-75f9-44f7-9da3-1b4a850ac090-utilities\") pod \"redhat-operators-rvmkh\" (UID: \"b855f400-75f9-44f7-9da3-1b4a850ac090\") " pod="openshift-marketplace/redhat-operators-rvmkh"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.135105 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b855f400-75f9-44f7-9da3-1b4a850ac090-catalog-content\") pod \"redhat-operators-rvmkh\" (UID: \"b855f400-75f9-44f7-9da3-1b4a850ac090\") " pod="openshift-marketplace/redhat-operators-rvmkh"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.184792 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-jxft6"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.189649 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 07:50:51 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld
Oct 03 07:50:51 crc kubenswrapper[4664]: [+]process-running ok
Oct 03 07:50:51 crc kubenswrapper[4664]: healthz check failed
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.189710 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.195395 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnbfr\" (UniqueName: \"kubernetes.io/projected/b855f400-75f9-44f7-9da3-1b4a850ac090-kube-api-access-qnbfr\") pod \"redhat-operators-rvmkh\" (UID: \"b855f400-75f9-44f7-9da3-1b4a850ac090\") " pod="openshift-marketplace/redhat-operators-rvmkh"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.275519 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qt6g9"]
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.276969 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qt6g9"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.290934 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qt6g9"]
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.326054 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rvmkh"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.438595 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-catalog-content\") pod \"redhat-operators-qt6g9\" (UID: \"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c\") " pod="openshift-marketplace/redhat-operators-qt6g9"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.438924 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbxjg\" (UniqueName: \"kubernetes.io/projected/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-kube-api-access-fbxjg\") pod \"redhat-operators-qt6g9\" (UID: \"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c\") " pod="openshift-marketplace/redhat-operators-qt6g9"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.438977 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-utilities\") pod \"redhat-operators-qt6g9\" (UID: \"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c\") " pod="openshift-marketplace/redhat-operators-qt6g9"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.539902 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-catalog-content\") pod \"redhat-operators-qt6g9\" (UID: \"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c\") " pod="openshift-marketplace/redhat-operators-qt6g9"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.539944 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbxjg\" (UniqueName: \"kubernetes.io/projected/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-kube-api-access-fbxjg\") pod \"redhat-operators-qt6g9\" (UID: \"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c\") " pod="openshift-marketplace/redhat-operators-qt6g9"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.539988 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-utilities\") pod \"redhat-operators-qt6g9\" (UID: \"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c\") " pod="openshift-marketplace/redhat-operators-qt6g9"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.540483 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-utilities\") pod \"redhat-operators-qt6g9\" (UID: \"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c\") " pod="openshift-marketplace/redhat-operators-qt6g9"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.540735 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-catalog-content\") pod \"redhat-operators-qt6g9\" (UID: \"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c\") " pod="openshift-marketplace/redhat-operators-qt6g9"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.570658 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbxjg\" (UniqueName: \"kubernetes.io/projected/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-kube-api-access-fbxjg\") pod \"redhat-operators-qt6g9\" (UID: \"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c\") " pod="openshift-marketplace/redhat-operators-qt6g9"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.598351 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qt6g9"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.782969 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rvmkh"]
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.845043 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qt6g9"]
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.867061 4664 generic.go:334] "Generic (PLEG): container finished" podID="18497250-f896-4630-98c4-c2915a50fd1b" containerID="e5ccadc80da2b8886ced88a94dee899f25bdb2bbd59a3e93c1457a181a03d91f" exitCode=0
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.867908 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhpld" event={"ID":"18497250-f896-4630-98c4-c2915a50fd1b","Type":"ContainerDied","Data":"e5ccadc80da2b8886ced88a94dee899f25bdb2bbd59a3e93c1457a181a03d91f"}
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.881524 4664 generic.go:334] "Generic (PLEG): container finished" podID="583c3690-dbf8-4272-bb35-b5557b7a3e74" containerID="e9e355b6077feadbe86518ec74bafe3c4547c5c7215287c7e918864fc301967d" exitCode=0
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.887005 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.887736 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtxvm" event={"ID":"583c3690-dbf8-4272-bb35-b5557b7a3e74","Type":"ContainerDied","Data":"e9e355b6077feadbe86518ec74bafe3c4547c5c7215287c7e918864fc301967d"}
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.888932 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-szr58" event={"ID":"478f27ac-050e-4086-82c9-2e23559cf70b","Type":"ContainerStarted","Data":"bf0b9811e718078c691368635b828a300e6559d943528edc44e70aab1c4f9ba4"}
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.888975 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.888987 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-szr58" event={"ID":"478f27ac-050e-4086-82c9-2e23559cf70b","Type":"ContainerStarted","Data":"778d2bcdb0801b4aaeca5861384f6171396b6cbb1e709bc89918b8f4533d0e0c"}
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.901046 4664 generic.go:334] "Generic (PLEG): container finished" podID="63d05775-5de6-4f48-bfa5-96df1a0a8aa3" containerID="5229461f823475bbfac28c5d824a0657ed30b5a622e401c8b6e339bf52ce9782" exitCode=0
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.901131 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"63d05775-5de6-4f48-bfa5-96df1a0a8aa3","Type":"ContainerDied","Data":"5229461f823475bbfac28c5d824a0657ed30b5a622e401c8b6e339bf52ce9782"}
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.909182 4664 generic.go:334] "Generic (PLEG): container finished" podID="5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91" containerID="45f77438245ec43407abee2ee75046f6eb1389e179ca39cbe00dce89645fb880" exitCode=0
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.909314 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" event={"ID":"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91","Type":"ContainerDied","Data":"45f77438245ec43407abee2ee75046f6eb1389e179ca39cbe00dce89645fb880"}
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.923888 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d52b7963-c3dd-44d1-b33f-14b98a2ef0f2","Type":"ContainerStarted","Data":"b1055a4de0d032eb8f8d048286348cf3792f329f6d0b6c79642847f2c645fc9c"}
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.923928 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d52b7963-c3dd-44d1-b33f-14b98a2ef0f2","Type":"ContainerStarted","Data":"bb71d360c16dd842d7a1fe1eead235637beeee539bf729c9009e749e251a353d"}
Oct 03 07:50:51 crc kubenswrapper[4664]: I1003 07:50:51.989595 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-szr58" podStartSLOduration=126.98957294 podStartE2EDuration="2m6.98957294s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:51.950163434 +0000 UTC m=+152.771353934" watchObservedRunningTime="2025-10-03 07:50:51.98957294 +0000 UTC m=+152.810763430"
Oct 03 07:50:52 crc kubenswrapper[4664]: I1003 07:50:52.046291 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.046271613 podStartE2EDuration="2.046271613s" podCreationTimestamp="2025-10-03 07:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:50:52.043422726 +0000 UTC m=+152.864613256" watchObservedRunningTime="2025-10-03 07:50:52.046271613 +0000 UTC m=+152.867462103"
Oct 03 07:50:52 crc kubenswrapper[4664]: I1003 07:50:52.053917 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xt6kz"
Oct 03 07:50:52 crc kubenswrapper[4664]: I1003 07:50:52.195684 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 07:50:52 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld
Oct 03 07:50:52 crc kubenswrapper[4664]: [+]process-running ok
Oct 03 07:50:52 crc kubenswrapper[4664]: healthz check failed
Oct 03 07:50:52 crc kubenswrapper[4664]: I1003 07:50:52.196055 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 07:50:52 crc kubenswrapper[4664]: I1003 07:50:52.934587 4664 generic.go:334] "Generic (PLEG): container finished" podID="b855f400-75f9-44f7-9da3-1b4a850ac090" containerID="b62562a28a58fabb9a5b01f537214d33128d05ad09af16996ffca85b151636d5" exitCode=0
Oct 03 07:50:52 crc kubenswrapper[4664]: I1003 07:50:52.934768 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rvmkh" event={"ID":"b855f400-75f9-44f7-9da3-1b4a850ac090","Type":"ContainerDied","Data":"b62562a28a58fabb9a5b01f537214d33128d05ad09af16996ffca85b151636d5"}
Oct 03 07:50:52 crc kubenswrapper[4664]: I1003 07:50:52.934799 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rvmkh" event={"ID":"b855f400-75f9-44f7-9da3-1b4a850ac090","Type":"ContainerStarted","Data":"9f44bf72963849d735930f05db476a9c0207df8f26361baaa1690c872302be58"}
Oct 03 07:50:52 crc kubenswrapper[4664]: I1003 07:50:52.938995 4664 generic.go:334] "Generic (PLEG): container finished" podID="d52b7963-c3dd-44d1-b33f-14b98a2ef0f2" containerID="b1055a4de0d032eb8f8d048286348cf3792f329f6d0b6c79642847f2c645fc9c" exitCode=0
Oct 03 07:50:52 crc kubenswrapper[4664]: I1003 07:50:52.939085 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d52b7963-c3dd-44d1-b33f-14b98a2ef0f2","Type":"ContainerDied","Data":"b1055a4de0d032eb8f8d048286348cf3792f329f6d0b6c79642847f2c645fc9c"}
Oct 03 07:50:52 crc kubenswrapper[4664]: I1003 07:50:52.941021 4664 generic.go:334] "Generic (PLEG): container finished" podID="875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" containerID="c5606cc81c1a3f77032f74f2702845e639ef6a8c9eeec5a1ce8fe6f1de25de92" exitCode=0
Oct 03 07:50:52 crc kubenswrapper[4664]: I1003 07:50:52.941049 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qt6g9" event={"ID":"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c","Type":"ContainerDied","Data":"c5606cc81c1a3f77032f74f2702845e639ef6a8c9eeec5a1ce8fe6f1de25de92"}
Oct 03 07:50:52 crc kubenswrapper[4664]: I1003 07:50:52.941075 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qt6g9" event={"ID":"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c","Type":"ContainerStarted","Data":"467da52daa29e1b2dd71ba79db197b937b8e6923ea59df26108131d410984ff5"}
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.187144 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 07:50:53 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld
Oct 03 07:50:53 crc kubenswrapper[4664]: [+]process-running ok
Oct 03 07:50:53 crc kubenswrapper[4664]: healthz check failed
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.187451 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.248901 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.297651 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m"
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.370740 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cchd\" (UniqueName: \"kubernetes.io/projected/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-kube-api-access-9cchd\") pod \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\" (UID: \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\") "
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.370787 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63d05775-5de6-4f48-bfa5-96df1a0a8aa3-kubelet-dir\") pod \"63d05775-5de6-4f48-bfa5-96df1a0a8aa3\" (UID: \"63d05775-5de6-4f48-bfa5-96df1a0a8aa3\") "
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.370828 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-config-volume\") pod \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\" (UID: \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\") "
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.370893 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-secret-volume\") pod \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\" (UID: \"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91\") "
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.370975 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63d05775-5de6-4f48-bfa5-96df1a0a8aa3-kube-api-access\") pod \"63d05775-5de6-4f48-bfa5-96df1a0a8aa3\" (UID: \"63d05775-5de6-4f48-bfa5-96df1a0a8aa3\") "
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.371634 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-config-volume" (OuterVolumeSpecName: "config-volume") pod "5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91" (UID: "5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.370969 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63d05775-5de6-4f48-bfa5-96df1a0a8aa3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "63d05775-5de6-4f48-bfa5-96df1a0a8aa3" (UID: "63d05775-5de6-4f48-bfa5-96df1a0a8aa3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.378106 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-kube-api-access-9cchd" (OuterVolumeSpecName: "kube-api-access-9cchd") pod "5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91" (UID: "5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91"). InnerVolumeSpecName "kube-api-access-9cchd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.378763 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63d05775-5de6-4f48-bfa5-96df1a0a8aa3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "63d05775-5de6-4f48-bfa5-96df1a0a8aa3" (UID: "63d05775-5de6-4f48-bfa5-96df1a0a8aa3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.382425 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91" (UID: "5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.472807 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63d05775-5de6-4f48-bfa5-96df1a0a8aa3-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.472848 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cchd\" (UniqueName: \"kubernetes.io/projected/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-kube-api-access-9cchd\") on node \"crc\" DevicePath \"\""
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.472864 4664 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63d05775-5de6-4f48-bfa5-96df1a0a8aa3-kubelet-dir\") on node \"crc\" DevicePath \"\""
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.472877 4664 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-config-volume\") on node \"crc\" DevicePath \"\""
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.472888 4664 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.952707 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.952714 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"63d05775-5de6-4f48-bfa5-96df1a0a8aa3","Type":"ContainerDied","Data":"af0f3c88e41d2ee87e5720f5802ba4a2509049168f2c93bfe41df10648108805"}
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.952811 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af0f3c88e41d2ee87e5720f5802ba4a2509049168f2c93bfe41df10648108805"
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.955032 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m" event={"ID":"5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91","Type":"ContainerDied","Data":"e55ddbdee2a0207580bbf04cd051441d8f90da78dbb4a6116d66741cfbf9388e"}
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.955077 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m"
Oct 03 07:50:53 crc kubenswrapper[4664]: I1003 07:50:53.955085 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e55ddbdee2a0207580bbf04cd051441d8f90da78dbb4a6116d66741cfbf9388e"
Oct 03 07:50:54 crc kubenswrapper[4664]: I1003 07:50:54.134590 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-942tg"
Oct 03 07:50:54 crc kubenswrapper[4664]: I1003 07:50:54.188240 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 07:50:54 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld
Oct 03 07:50:54 crc kubenswrapper[4664]: [+]process-running ok
Oct 03 07:50:54 crc kubenswrapper[4664]: healthz check failed
Oct 03 07:50:54 crc kubenswrapper[4664]: I1003 07:50:54.188305 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 07:50:54 crc kubenswrapper[4664]: I1003 07:50:54.270771 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 07:50:54 crc kubenswrapper[4664]: I1003 07:50:54.391482 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d52b7963-c3dd-44d1-b33f-14b98a2ef0f2-kubelet-dir\") pod \"d52b7963-c3dd-44d1-b33f-14b98a2ef0f2\" (UID: \"d52b7963-c3dd-44d1-b33f-14b98a2ef0f2\") "
Oct 03 07:50:54 crc kubenswrapper[4664]: I1003 07:50:54.391560 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d52b7963-c3dd-44d1-b33f-14b98a2ef0f2-kube-api-access\") pod \"d52b7963-c3dd-44d1-b33f-14b98a2ef0f2\" (UID: \"d52b7963-c3dd-44d1-b33f-14b98a2ef0f2\") "
Oct 03 07:50:54 crc kubenswrapper[4664]: I1003 07:50:54.391635 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d52b7963-c3dd-44d1-b33f-14b98a2ef0f2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d52b7963-c3dd-44d1-b33f-14b98a2ef0f2" (UID: "d52b7963-c3dd-44d1-b33f-14b98a2ef0f2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 03 07:50:54 crc kubenswrapper[4664]: I1003 07:50:54.391910 4664 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d52b7963-c3dd-44d1-b33f-14b98a2ef0f2-kubelet-dir\") on node \"crc\" DevicePath \"\""
Oct 03 07:50:54 crc kubenswrapper[4664]: I1003 07:50:54.398263 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d52b7963-c3dd-44d1-b33f-14b98a2ef0f2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d52b7963-c3dd-44d1-b33f-14b98a2ef0f2" (UID: "d52b7963-c3dd-44d1-b33f-14b98a2ef0f2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 07:50:54 crc kubenswrapper[4664]: I1003 07:50:54.493228 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d52b7963-c3dd-44d1-b33f-14b98a2ef0f2-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 03 07:50:54 crc kubenswrapper[4664]: I1003 07:50:54.964260 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d52b7963-c3dd-44d1-b33f-14b98a2ef0f2","Type":"ContainerDied","Data":"bb71d360c16dd842d7a1fe1eead235637beeee539bf729c9009e749e251a353d"}
Oct 03 07:50:54 crc kubenswrapper[4664]: I1003 07:50:54.964305 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb71d360c16dd842d7a1fe1eead235637beeee539bf729c9009e749e251a353d"
Oct 03 07:50:54 crc kubenswrapper[4664]: I1003 07:50:54.964334 4664 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 07:50:55 crc kubenswrapper[4664]: I1003 07:50:55.186421 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 07:50:55 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld Oct 03 07:50:55 crc kubenswrapper[4664]: [+]process-running ok Oct 03 07:50:55 crc kubenswrapper[4664]: healthz check failed Oct 03 07:50:55 crc kubenswrapper[4664]: I1003 07:50:55.186500 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 07:50:55 crc kubenswrapper[4664]: I1003 07:50:55.746963 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:55 crc kubenswrapper[4664]: I1003 07:50:55.762064 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-87c8s" Oct 03 07:50:56 crc kubenswrapper[4664]: I1003 07:50:56.187182 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 07:50:56 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld Oct 03 07:50:56 crc kubenswrapper[4664]: [+]process-running ok Oct 03 07:50:56 crc kubenswrapper[4664]: healthz check failed Oct 03 07:50:56 crc kubenswrapper[4664]: I1003 07:50:56.187238 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 07:50:57 crc kubenswrapper[4664]: I1003 07:50:57.187226 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 07:50:57 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld Oct 03 07:50:57 crc kubenswrapper[4664]: [+]process-running ok Oct 03 07:50:57 crc kubenswrapper[4664]: healthz check failed Oct 03 07:50:57 crc kubenswrapper[4664]: I1003 07:50:57.187301 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 07:50:58 crc kubenswrapper[4664]: I1003 07:50:58.186308 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 07:50:58 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld Oct 03 07:50:58 crc kubenswrapper[4664]: [+]process-running ok Oct 03 07:50:58 crc kubenswrapper[4664]: healthz check failed Oct 03 07:50:58 crc kubenswrapper[4664]: I1003 07:50:58.186365 4664 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 07:50:59 crc kubenswrapper[4664]: I1003 07:50:59.187638 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 07:50:59 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld Oct 03 07:50:59 crc kubenswrapper[4664]: [+]process-running ok Oct 03 07:50:59 crc kubenswrapper[4664]: healthz check failed Oct 03 07:50:59 crc kubenswrapper[4664]: I1003 07:50:59.188169 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 07:51:00 crc kubenswrapper[4664]: I1003 07:51:00.186269 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 07:51:00 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld Oct 03 07:51:00 crc kubenswrapper[4664]: [+]process-running ok Oct 03 07:51:00 crc kubenswrapper[4664]: healthz check failed Oct 03 07:51:00 crc kubenswrapper[4664]: I1003 07:51:00.186323 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 07:51:00 crc kubenswrapper[4664]: I1003 07:51:00.724777 4664 patch_prober.go:28] interesting pod/console-f9d7485db-2m6x7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Oct 03 07:51:00 crc kubenswrapper[4664]: I1003 07:51:00.724831 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-2m6x7" podUID="ceef7ba8-f996-4b56-a477-23873e39cde7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 03 07:51:00 crc kubenswrapper[4664]: I1003 07:51:00.945084 4664 patch_prober.go:28] interesting pod/downloads-7954f5f757-zcngp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 03 07:51:00 crc kubenswrapper[4664]: I1003 07:51:00.945144 4664 patch_prober.go:28] interesting pod/downloads-7954f5f757-zcngp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 03 07:51:00 crc kubenswrapper[4664]: I1003 07:51:00.945204 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zcngp" podUID="acf5523b-3f1d-495e-8014-0313925e8727" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 
10.217.0.10:8080: connect: connection refused" Oct 03 07:51:00 crc kubenswrapper[4664]: I1003 07:51:00.945147 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zcngp" podUID="acf5523b-3f1d-495e-8014-0313925e8727" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 03 07:51:01 crc kubenswrapper[4664]: I1003 07:51:01.186913 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 07:51:01 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld Oct 03 07:51:01 crc kubenswrapper[4664]: [+]process-running ok Oct 03 07:51:01 crc kubenswrapper[4664]: healthz check failed Oct 03 07:51:01 crc kubenswrapper[4664]: I1003 07:51:01.186980 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 07:51:02 crc kubenswrapper[4664]: I1003 07:51:02.186967 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 07:51:02 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld Oct 03 07:51:02 crc kubenswrapper[4664]: [+]process-running ok Oct 03 07:51:02 crc kubenswrapper[4664]: healthz check failed Oct 03 07:51:02 crc kubenswrapper[4664]: I1003 07:51:02.187283 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 07:51:03 crc kubenswrapper[4664]: I1003 07:51:03.186655 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 07:51:03 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld Oct 03 07:51:03 crc kubenswrapper[4664]: [+]process-running ok Oct 03 07:51:03 crc kubenswrapper[4664]: healthz check failed Oct 03 07:51:03 crc kubenswrapper[4664]: I1003 07:51:03.186740 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 07:51:04 crc kubenswrapper[4664]: I1003 07:51:04.186666 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 07:51:04 crc kubenswrapper[4664]: [-]has-synced failed: reason withheld Oct 03 07:51:04 crc kubenswrapper[4664]: [+]process-running ok Oct 03 07:51:04 crc kubenswrapper[4664]: healthz check failed Oct 03 07:51:04 crc kubenswrapper[4664]: I1003 07:51:04.186771 4664 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 07:51:05 crc kubenswrapper[4664]: I1003 07:51:05.186776 4664 patch_prober.go:28] interesting pod/router-default-5444994796-jxft6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 07:51:05 crc kubenswrapper[4664]: [+]has-synced ok Oct 03 07:51:05 crc kubenswrapper[4664]: [+]process-running ok Oct 03 07:51:05 crc kubenswrapper[4664]: healthz check failed Oct 03 07:51:05 crc kubenswrapper[4664]: I1003 07:51:05.186840 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jxft6" podUID="bcb87980-5888-4a30-859f-a9ac5b95f2c0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 07:51:06 crc kubenswrapper[4664]: I1003 07:51:06.188855 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:51:06 crc kubenswrapper[4664]: I1003 07:51:06.194431 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-jxft6" Oct 03 07:51:07 crc kubenswrapper[4664]: I1003 07:51:07.599643 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs\") pod \"network-metrics-daemon-l687s\" (UID: \"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\") " pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:51:07 crc kubenswrapper[4664]: I1003 07:51:07.605566 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f2800e0-b66e-4ab2-ad4f-37c5ffe60120-metrics-certs\") pod \"network-metrics-daemon-l687s\" (UID: \"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120\") " pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:51:07 crc kubenswrapper[4664]: I1003 07:51:07.718122 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l687s" Oct 03 07:51:10 crc kubenswrapper[4664]: I1003 07:51:10.636553 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-szr58" Oct 03 07:51:10 crc kubenswrapper[4664]: I1003 07:51:10.734305 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:51:10 crc kubenswrapper[4664]: I1003 07:51:10.739332 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 07:51:10 crc kubenswrapper[4664]: I1003 07:51:10.964669 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-zcngp" Oct 03 07:51:11 crc kubenswrapper[4664]: I1003 07:51:11.987497 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:51:11 crc kubenswrapper[4664]: I1003 07:51:11.987858 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:51:21 crc kubenswrapper[4664]: I1003 07:51:21.197866 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dg8s7" Oct 03 07:51:21 crc kubenswrapper[4664]: E1003 07:51:21.570646 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 07:51:21 crc kubenswrapper[4664]: E1003 07:51:21.570828 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p9f5t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-swwsk_openshift-marketplace(78dc4ca9-6ff9-428f-b341-a02f0a85dfec): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 07:51:21 crc kubenswrapper[4664]: E1003 07:51:21.571993 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-swwsk" podUID="78dc4ca9-6ff9-428f-b341-a02f0a85dfec" Oct 03 07:51:24 crc kubenswrapper[4664]: E1003 07:51:24.538698 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-swwsk" podUID="78dc4ca9-6ff9-428f-b341-a02f0a85dfec" Oct 03 07:51:24 crc kubenswrapper[4664]: E1003 07:51:24.610285 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 03 07:51:24 crc kubenswrapper[4664]: E1003 07:51:24.610523 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vkgg7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-ftqrw_openshift-marketplace(5d56b337-1eb1-4e79-b6ef-bb2d85737a41): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 07:51:24 crc kubenswrapper[4664]: E1003 07:51:24.611827 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-ftqrw" podUID="5d56b337-1eb1-4e79-b6ef-bb2d85737a41" Oct 03 07:51:25 crc kubenswrapper[4664]: E1003 07:51:25.216778 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 03 07:51:25 crc kubenswrapper[4664]: E1003 07:51:25.216965 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4m48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-2t2bc_openshift-marketplace(5c238baa-b35f-404b-b6e3-ebec940e30be): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 07:51:25 crc kubenswrapper[4664]: E1003 07:51:25.218215 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-2t2bc" podUID="5c238baa-b35f-404b-b6e3-ebec940e30be" Oct 03 07:51:27 crc kubenswrapper[4664]: E1003 07:51:27.395485 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-ftqrw" podUID="5d56b337-1eb1-4e79-b6ef-bb2d85737a41" Oct 03 07:51:27 crc kubenswrapper[4664]: E1003 07:51:27.396085 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-2t2bc" podUID="5c238baa-b35f-404b-b6e3-ebec940e30be" Oct 03 07:51:27 crc kubenswrapper[4664]: E1003 07:51:27.475254 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 07:51:27 crc kubenswrapper[4664]: E1003 07:51:27.475396 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jcm94,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-n4cpb_openshift-marketplace(ea079e38-0970-4e57-af62-4910892ea04d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 07:51:27 crc kubenswrapper[4664]: E1003 07:51:27.476645 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-n4cpb" podUID="ea079e38-0970-4e57-af62-4910892ea04d" Oct 03 07:51:27 crc kubenswrapper[4664]: E1003 07:51:27.477843 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 03 07:51:27 crc kubenswrapper[4664]: E1003 07:51:27.478855 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qnbfr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rvmkh_openshift-marketplace(b855f400-75f9-44f7-9da3-1b4a850ac090): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 07:51:27 crc kubenswrapper[4664]: E1003 07:51:27.480746 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rvmkh" podUID="b855f400-75f9-44f7-9da3-1b4a850ac090" Oct 03 07:51:27 crc kubenswrapper[4664]: I1003 07:51:27.920389 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 07:51:29 crc kubenswrapper[4664]: E1003 07:51:29.805671 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rvmkh" podUID="b855f400-75f9-44f7-9da3-1b4a850ac090" Oct 03 07:51:29 crc kubenswrapper[4664]: E1003 07:51:29.806718 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-n4cpb" podUID="ea079e38-0970-4e57-af62-4910892ea04d" Oct 03 07:51:29 crc kubenswrapper[4664]: E1003 07:51:29.833389 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 03 07:51:29 crc kubenswrapper[4664]: E1003 07:51:29.833551 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fbxjg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-qt6g9_openshift-marketplace(875a3a4c-cb52-4fb2-976d-0e795bbfcb4c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 07:51:29 crc kubenswrapper[4664]: E1003 07:51:29.834677 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-qt6g9" podUID="875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" Oct 03 07:51:30 crc kubenswrapper[4664]: E1003 07:51:30.483063 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-qt6g9" podUID="875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" Oct 03 07:51:30 crc kubenswrapper[4664]: E1003 07:51:30.539660 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 07:51:30 crc kubenswrapper[4664]: E1003 07:51:30.539820 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zh7gc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-gtxvm_openshift-marketplace(583c3690-dbf8-4272-bb35-b5557b7a3e74): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 07:51:30 crc kubenswrapper[4664]: E1003 07:51:30.541016 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-gtxvm" podUID="583c3690-dbf8-4272-bb35-b5557b7a3e74" Oct 03 07:51:30 crc kubenswrapper[4664]: E1003 07:51:30.544452 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 07:51:30 crc kubenswrapper[4664]: E1003 07:51:30.544572 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tjhfn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-dhpld_openshift-marketplace(18497250-f896-4630-98c4-c2915a50fd1b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 07:51:30 crc kubenswrapper[4664]: E1003 07:51:30.547534 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-dhpld" podUID="18497250-f896-4630-98c4-c2915a50fd1b" Oct 03 07:51:30 crc kubenswrapper[4664]: I1003 07:51:30.857573 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l687s"] Oct 03 07:51:31 crc kubenswrapper[4664]: I1003 07:51:31.151859 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l687s" event={"ID":"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120","Type":"ContainerStarted","Data":"b6bea96dac8cdd0b0148cc8d605b289d7534589ac88eb8d6e4354ccda41ce925"} Oct 03 07:51:31 crc kubenswrapper[4664]: I1003 07:51:31.152103 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l687s" event={"ID":"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120","Type":"ContainerStarted","Data":"0fe430741eef1f66150fb53b5cf6a08b1890eb60355180b292f7808a3fd90198"} Oct 03 07:51:31 crc kubenswrapper[4664]: E1003 07:51:31.152828 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-gtxvm" podUID="583c3690-dbf8-4272-bb35-b5557b7a3e74" Oct 03 07:51:31 crc kubenswrapper[4664]: E1003 07:51:31.153548 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-dhpld" 
podUID="18497250-f896-4630-98c4-c2915a50fd1b" Oct 03 07:51:32 crc kubenswrapper[4664]: I1003 07:51:32.158048 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l687s" event={"ID":"7f2800e0-b66e-4ab2-ad4f-37c5ffe60120","Type":"ContainerStarted","Data":"60537dec2d8010b51913fe38f564c6116cadbb8c55a367a0b5509b895fbc6c5f"} Oct 03 07:51:39 crc kubenswrapper[4664]: I1003 07:51:39.899099 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-l687s" podStartSLOduration=174.899080468 podStartE2EDuration="2m54.899080468s" podCreationTimestamp="2025-10-03 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:51:32.175879506 +0000 UTC m=+192.997070006" watchObservedRunningTime="2025-10-03 07:51:39.899080468 +0000 UTC m=+200.720270968" Oct 03 07:51:41 crc kubenswrapper[4664]: I1003 07:51:41.206786 4664 generic.go:334] "Generic (PLEG): container finished" podID="78dc4ca9-6ff9-428f-b341-a02f0a85dfec" containerID="f836dde346eea3ec0c840d7c59dd6ea0cee61cb2f73f7723d38436417d10e5fc" exitCode=0 Oct 03 07:51:41 crc kubenswrapper[4664]: I1003 07:51:41.206962 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swwsk" event={"ID":"78dc4ca9-6ff9-428f-b341-a02f0a85dfec","Type":"ContainerDied","Data":"f836dde346eea3ec0c840d7c59dd6ea0cee61cb2f73f7723d38436417d10e5fc"} Oct 03 07:51:41 crc kubenswrapper[4664]: I1003 07:51:41.987825 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:51:41 crc kubenswrapper[4664]: I1003 07:51:41.988634 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:51:44 crc kubenswrapper[4664]: I1003 07:51:44.227886 4664 generic.go:334] "Generic (PLEG): container finished" podID="ea079e38-0970-4e57-af62-4910892ea04d" containerID="19eeb749c947f19a3a372484ff897b2591519ebcc04afc0ca2b4f331bc4bd138" exitCode=0 Oct 03 07:51:44 crc kubenswrapper[4664]: I1003 07:51:44.227988 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4cpb" event={"ID":"ea079e38-0970-4e57-af62-4910892ea04d","Type":"ContainerDied","Data":"19eeb749c947f19a3a372484ff897b2591519ebcc04afc0ca2b4f331bc4bd138"} Oct 03 07:51:44 crc kubenswrapper[4664]: I1003 07:51:44.232390 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qt6g9" event={"ID":"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c","Type":"ContainerStarted","Data":"17d73ef300e08f93b334db06c1f68447693e3e2b171d10ca55fc4d744d9b2f8b"} Oct 03 07:51:44 crc kubenswrapper[4664]: I1003 07:51:44.235119 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swwsk" event={"ID":"78dc4ca9-6ff9-428f-b341-a02f0a85dfec","Type":"ContainerStarted","Data":"52ef1ca08d8b6bac463859c0cacd275467a32220128b477d64af1d2756587609"} Oct 03 07:51:44 crc kubenswrapper[4664]: I1003 
07:51:44.239477 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rvmkh" event={"ID":"b855f400-75f9-44f7-9da3-1b4a850ac090","Type":"ContainerStarted","Data":"09af6798fe0e74acd7e15256ad0a0f97c7bc9f7679278439751f7bae1494d955"} Oct 03 07:51:44 crc kubenswrapper[4664]: I1003 07:51:44.241289 4664 generic.go:334] "Generic (PLEG): container finished" podID="5c238baa-b35f-404b-b6e3-ebec940e30be" containerID="3f327c4c349eaf2f06061e3674b98765fa8604fdb5f0889cb36c04cad955e6f5" exitCode=0 Oct 03 07:51:44 crc kubenswrapper[4664]: I1003 07:51:44.241354 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2t2bc" event={"ID":"5c238baa-b35f-404b-b6e3-ebec940e30be","Type":"ContainerDied","Data":"3f327c4c349eaf2f06061e3674b98765fa8604fdb5f0889cb36c04cad955e6f5"} Oct 03 07:51:44 crc kubenswrapper[4664]: I1003 07:51:44.243774 4664 generic.go:334] "Generic (PLEG): container finished" podID="5d56b337-1eb1-4e79-b6ef-bb2d85737a41" containerID="614cbc1c4d5cfeeb06a9ddce7bb890feb739f9c9dbf1ef89af44e4fb688a84c7" exitCode=0 Oct 03 07:51:44 crc kubenswrapper[4664]: I1003 07:51:44.243811 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftqrw" event={"ID":"5d56b337-1eb1-4e79-b6ef-bb2d85737a41","Type":"ContainerDied","Data":"614cbc1c4d5cfeeb06a9ddce7bb890feb739f9c9dbf1ef89af44e4fb688a84c7"} Oct 03 07:51:44 crc kubenswrapper[4664]: I1003 07:51:44.292731 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-swwsk" podStartSLOduration=4.8942417559999996 podStartE2EDuration="57.292708424s" podCreationTimestamp="2025-10-03 07:50:47 +0000 UTC" firstStartedPulling="2025-10-03 07:50:50.809462135 +0000 UTC m=+151.630652625" lastFinishedPulling="2025-10-03 07:51:43.207928803 +0000 UTC m=+204.029119293" observedRunningTime="2025-10-03 07:51:44.282875947 +0000 UTC m=+205.104066457" watchObservedRunningTime="2025-10-03 07:51:44.292708424 +0000 UTC m=+205.113898934" Oct 03 07:51:45 crc kubenswrapper[4664]: I1003 07:51:45.249494 4664 generic.go:334] "Generic (PLEG): container finished" podID="b855f400-75f9-44f7-9da3-1b4a850ac090" containerID="09af6798fe0e74acd7e15256ad0a0f97c7bc9f7679278439751f7bae1494d955" exitCode=0 Oct 03 07:51:45 crc kubenswrapper[4664]: I1003 07:51:45.249826 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rvmkh" event={"ID":"b855f400-75f9-44f7-9da3-1b4a850ac090","Type":"ContainerDied","Data":"09af6798fe0e74acd7e15256ad0a0f97c7bc9f7679278439751f7bae1494d955"} Oct 03 07:51:45 crc kubenswrapper[4664]: I1003 07:51:45.253615 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2t2bc" event={"ID":"5c238baa-b35f-404b-b6e3-ebec940e30be","Type":"ContainerStarted","Data":"47e075c99e8cb43fd41f83ca04ac2ab2317548da290bb0374f603bc8d2b82130"} Oct 03 07:51:45 crc kubenswrapper[4664]: I1003 07:51:45.256092 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftqrw" event={"ID":"5d56b337-1eb1-4e79-b6ef-bb2d85737a41","Type":"ContainerStarted","Data":"4212b2bf969e9c4b3b3e50c91c4264d5c29482a317b9546db694fd0662669046"} Oct 03 07:51:45 crc kubenswrapper[4664]: I1003 07:51:45.258283 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4cpb" 
event={"ID":"ea079e38-0970-4e57-af62-4910892ea04d","Type":"ContainerStarted","Data":"1ecf1bc8ddb7df3a2e5880d5ae0f1041a7702b1c88649b4085a79b4eab9989f2"} Oct 03 07:51:45 crc kubenswrapper[4664]: I1003 07:51:45.260924 4664 generic.go:334] "Generic (PLEG): container finished" podID="875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" containerID="17d73ef300e08f93b334db06c1f68447693e3e2b171d10ca55fc4d744d9b2f8b" exitCode=0 Oct 03 07:51:45 crc kubenswrapper[4664]: I1003 07:51:45.260975 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qt6g9" event={"ID":"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c","Type":"ContainerDied","Data":"17d73ef300e08f93b334db06c1f68447693e3e2b171d10ca55fc4d744d9b2f8b"} Oct 03 07:51:45 crc kubenswrapper[4664]: I1003 07:51:45.304811 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2t2bc" podStartSLOduration=4.24511492 podStartE2EDuration="58.304793509s" podCreationTimestamp="2025-10-03 07:50:47 +0000 UTC" firstStartedPulling="2025-10-03 07:50:50.705129977 +0000 UTC m=+151.526320467" lastFinishedPulling="2025-10-03 07:51:44.764808566 +0000 UTC m=+205.585999056" observedRunningTime="2025-10-03 07:51:45.300649355 +0000 UTC m=+206.121839855" watchObservedRunningTime="2025-10-03 07:51:45.304793509 +0000 UTC m=+206.125983999" Oct 03 07:51:45 crc kubenswrapper[4664]: I1003 07:51:45.325815 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ftqrw" podStartSLOduration=3.415199978 podStartE2EDuration="57.325792547s" podCreationTimestamp="2025-10-03 07:50:48 +0000 UTC" firstStartedPulling="2025-10-03 07:50:50.801866347 +0000 UTC m=+151.623056847" lastFinishedPulling="2025-10-03 07:51:44.712458926 +0000 UTC m=+205.533649416" observedRunningTime="2025-10-03 07:51:45.321972684 +0000 UTC m=+206.143163184" watchObservedRunningTime="2025-10-03 07:51:45.325792547 +0000 UTC m=+206.146983037" Oct 03 07:51:45 crc kubenswrapper[4664]: I1003 07:51:45.898435 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n4cpb" podStartSLOduration=4.950640929 podStartE2EDuration="58.898412384s" podCreationTimestamp="2025-10-03 07:50:47 +0000 UTC" firstStartedPulling="2025-10-03 07:50:50.798510603 +0000 UTC m=+151.619701083" lastFinishedPulling="2025-10-03 07:51:44.746282048 +0000 UTC m=+205.567472538" observedRunningTime="2025-10-03 07:51:45.366060877 +0000 UTC m=+206.187251397" watchObservedRunningTime="2025-10-03 07:51:45.898412384 +0000 UTC m=+206.719602874" Oct 03 07:51:46 crc kubenswrapper[4664]: I1003 07:51:46.267436 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rvmkh" event={"ID":"b855f400-75f9-44f7-9da3-1b4a850ac090","Type":"ContainerStarted","Data":"58006fba632e0bf618bc78808f16fcb633942aca32c33b55d32489b5cfa6e9c3"} Oct 03 07:51:46 crc kubenswrapper[4664]: I1003 07:51:46.269144 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qt6g9" event={"ID":"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c","Type":"ContainerStarted","Data":"d5ee632c1c95a8d1bc2d7e2d5763fe6513421030c1f4a02e65d2494b26605606"} Oct 03 07:51:46 crc kubenswrapper[4664]: I1003 07:51:46.317798 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rvmkh" podStartSLOduration=3.623443188 podStartE2EDuration="56.317783523s" podCreationTimestamp="2025-10-03 
07:50:50 +0000 UTC" firstStartedPulling="2025-10-03 07:50:52.937224614 +0000 UTC m=+153.758415104" lastFinishedPulling="2025-10-03 07:51:45.631564949 +0000 UTC m=+206.452755439" observedRunningTime="2025-10-03 07:51:46.294649597 +0000 UTC m=+207.115840107" watchObservedRunningTime="2025-10-03 07:51:46.317783523 +0000 UTC m=+207.138974003" Oct 03 07:51:47 crc kubenswrapper[4664]: I1003 07:51:47.275302 4664 generic.go:334] "Generic (PLEG): container finished" podID="583c3690-dbf8-4272-bb35-b5557b7a3e74" containerID="5f9bd733440713c370f0f03c6b638cb3bdbf57b2f7c529be624663a75bf7d830" exitCode=0 Oct 03 07:51:47 crc kubenswrapper[4664]: I1003 07:51:47.275476 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtxvm" event={"ID":"583c3690-dbf8-4272-bb35-b5557b7a3e74","Type":"ContainerDied","Data":"5f9bd733440713c370f0f03c6b638cb3bdbf57b2f7c529be624663a75bf7d830"} Oct 03 07:51:47 crc kubenswrapper[4664]: I1003 07:51:47.278364 4664 generic.go:334] "Generic (PLEG): container finished" podID="18497250-f896-4630-98c4-c2915a50fd1b" containerID="fbcf5ae0b5c1bb4e0cf908b26053a6a40a6550f97be733a20ee3243a1655dafe" exitCode=0 Oct 03 07:51:47 crc kubenswrapper[4664]: I1003 07:51:47.278401 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhpld" event={"ID":"18497250-f896-4630-98c4-c2915a50fd1b","Type":"ContainerDied","Data":"fbcf5ae0b5c1bb4e0cf908b26053a6a40a6550f97be733a20ee3243a1655dafe"} Oct 03 07:51:47 crc kubenswrapper[4664]: I1003 07:51:47.314104 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qt6g9" podStartSLOduration=3.383060502 podStartE2EDuration="56.314082018s" podCreationTimestamp="2025-10-03 07:50:51 +0000 UTC" firstStartedPulling="2025-10-03 07:50:52.9433016 +0000 UTC m=+153.764492090" lastFinishedPulling="2025-10-03 07:51:45.874323116 +0000 UTC m=+206.695513606" observedRunningTime="2025-10-03 07:51:46.320243083 +0000 UTC m=+207.141433593" watchObservedRunningTime="2025-10-03 07:51:47.314082018 +0000 UTC m=+208.135272508" Oct 03 07:51:47 crc kubenswrapper[4664]: I1003 07:51:47.836152 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n4cpb" Oct 03 07:51:47 crc kubenswrapper[4664]: I1003 07:51:47.836191 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n4cpb" Oct 03 07:51:47 crc kubenswrapper[4664]: I1003 07:51:47.981212 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n4cpb" Oct 03 07:51:48 crc kubenswrapper[4664]: I1003 07:51:48.033647 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2t2bc" Oct 03 07:51:48 crc kubenswrapper[4664]: I1003 07:51:48.033692 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2t2bc" Oct 03 07:51:48 crc kubenswrapper[4664]: I1003 07:51:48.069433 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2t2bc" Oct 03 07:51:48 crc kubenswrapper[4664]: I1003 07:51:48.252378 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-swwsk" Oct 03 07:51:48 crc kubenswrapper[4664]: I1003 07:51:48.252532 4664 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-swwsk" Oct 03 07:51:48 crc kubenswrapper[4664]: I1003 07:51:48.286046 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhpld" event={"ID":"18497250-f896-4630-98c4-c2915a50fd1b","Type":"ContainerStarted","Data":"3d8244b72b4b0e4d55f044846fb054cfa642f33d57d94631ca6380b09a7b670c"} Oct 03 07:51:48 crc kubenswrapper[4664]: I1003 07:51:48.293229 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtxvm" event={"ID":"583c3690-dbf8-4272-bb35-b5557b7a3e74","Type":"ContainerStarted","Data":"27dd716a1f125c3785120296322d9108be131c2470b085d701985b1c8d2d00b7"} Oct 03 07:51:48 crc kubenswrapper[4664]: I1003 07:51:48.298724 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-swwsk" Oct 03 07:51:48 crc kubenswrapper[4664]: I1003 07:51:48.329833 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dhpld" podStartSLOduration=3.52333417 podStartE2EDuration="59.329813671s" podCreationTimestamp="2025-10-03 07:50:49 +0000 UTC" firstStartedPulling="2025-10-03 07:50:51.889447475 +0000 UTC m=+152.710637965" lastFinishedPulling="2025-10-03 07:51:47.695926976 +0000 UTC m=+208.517117466" observedRunningTime="2025-10-03 07:51:48.312981837 +0000 UTC m=+209.134172347" watchObservedRunningTime="2025-10-03 07:51:48.329813671 +0000 UTC m=+209.151004161" Oct 03 07:51:48 crc kubenswrapper[4664]: I1003 07:51:48.477204 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ftqrw" Oct 03 07:51:48 crc kubenswrapper[4664]: I1003 07:51:48.477511 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ftqrw" Oct 03 07:51:48 crc kubenswrapper[4664]: I1003 07:51:48.522958 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ftqrw" Oct 03 07:51:48 crc kubenswrapper[4664]: I1003 07:51:48.544693 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gtxvm" podStartSLOduration=3.751329841 podStartE2EDuration="59.544676178s" podCreationTimestamp="2025-10-03 07:50:49 +0000 UTC" firstStartedPulling="2025-10-03 07:50:51.889360162 +0000 UTC m=+152.710550652" lastFinishedPulling="2025-10-03 07:51:47.682706499 +0000 UTC m=+208.503896989" observedRunningTime="2025-10-03 07:51:48.351838652 +0000 UTC m=+209.173029152" watchObservedRunningTime="2025-10-03 07:51:48.544676178 +0000 UTC m=+209.365866668" Oct 03 07:51:49 crc kubenswrapper[4664]: I1003 07:51:49.336700 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ftqrw" Oct 03 07:51:49 crc kubenswrapper[4664]: I1003 07:51:49.349415 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-swwsk" Oct 03 07:51:49 crc kubenswrapper[4664]: I1003 07:51:49.910736 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gtxvm" Oct 03 07:51:49 crc kubenswrapper[4664]: I1003 07:51:49.911002 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gtxvm" Oct 03 07:51:49 crc 
kubenswrapper[4664]: I1003 07:51:49.952718 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gtxvm" Oct 03 07:51:50 crc kubenswrapper[4664]: I1003 07:51:50.231805 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:51:50 crc kubenswrapper[4664]: I1003 07:51:50.231975 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:51:50 crc kubenswrapper[4664]: I1003 07:51:50.279883 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:51:50 crc kubenswrapper[4664]: I1003 07:51:50.508027 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-swwsk"] Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.306145 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-swwsk" podUID="78dc4ca9-6ff9-428f-b341-a02f0a85dfec" containerName="registry-server" containerID="cri-o://52ef1ca08d8b6bac463859c0cacd275467a32220128b477d64af1d2756587609" gracePeriod=2 Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.326428 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rvmkh" Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.326475 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rvmkh" Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.368047 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rvmkh" Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.599459 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qt6g9" Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.599855 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qt6g9" Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.645325 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qt6g9" Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.680888 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-swwsk" Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.792600 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9f5t\" (UniqueName: \"kubernetes.io/projected/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-kube-api-access-p9f5t\") pod \"78dc4ca9-6ff9-428f-b341-a02f0a85dfec\" (UID: \"78dc4ca9-6ff9-428f-b341-a02f0a85dfec\") " Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.793062 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-catalog-content\") pod \"78dc4ca9-6ff9-428f-b341-a02f0a85dfec\" (UID: \"78dc4ca9-6ff9-428f-b341-a02f0a85dfec\") " Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.793279 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-utilities\") pod \"78dc4ca9-6ff9-428f-b341-a02f0a85dfec\" (UID: \"78dc4ca9-6ff9-428f-b341-a02f0a85dfec\") " Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.794191 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-utilities" (OuterVolumeSpecName: "utilities") pod "78dc4ca9-6ff9-428f-b341-a02f0a85dfec" (UID: "78dc4ca9-6ff9-428f-b341-a02f0a85dfec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.798319 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-kube-api-access-p9f5t" (OuterVolumeSpecName: "kube-api-access-p9f5t") pod "78dc4ca9-6ff9-428f-b341-a02f0a85dfec" (UID: "78dc4ca9-6ff9-428f-b341-a02f0a85dfec"). InnerVolumeSpecName "kube-api-access-p9f5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.894991 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.895050 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9f5t\" (UniqueName: \"kubernetes.io/projected/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-kube-api-access-p9f5t\") on node \"crc\" DevicePath \"\"" Oct 03 07:51:51 crc kubenswrapper[4664]: I1003 07:51:51.908453 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftqrw"] Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.112255 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78dc4ca9-6ff9-428f-b341-a02f0a85dfec" (UID: "78dc4ca9-6ff9-428f-b341-a02f0a85dfec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.199426 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78dc4ca9-6ff9-428f-b341-a02f0a85dfec-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.312493 4664 generic.go:334] "Generic (PLEG): container finished" podID="78dc4ca9-6ff9-428f-b341-a02f0a85dfec" containerID="52ef1ca08d8b6bac463859c0cacd275467a32220128b477d64af1d2756587609" exitCode=0 Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.312584 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swwsk" event={"ID":"78dc4ca9-6ff9-428f-b341-a02f0a85dfec","Type":"ContainerDied","Data":"52ef1ca08d8b6bac463859c0cacd275467a32220128b477d64af1d2756587609"} Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.312841 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swwsk" event={"ID":"78dc4ca9-6ff9-428f-b341-a02f0a85dfec","Type":"ContainerDied","Data":"e05f47359152ac2b3f8ae24e42c3e38dc6903ef306bd4df16b375778b197dab2"} Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.312601 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swwsk" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.312865 4664 scope.go:117] "RemoveContainer" containerID="52ef1ca08d8b6bac463859c0cacd275467a32220128b477d64af1d2756587609" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.313547 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ftqrw" podUID="5d56b337-1eb1-4e79-b6ef-bb2d85737a41" containerName="registry-server" containerID="cri-o://4212b2bf969e9c4b3b3e50c91c4264d5c29482a317b9546db694fd0662669046" gracePeriod=2 Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.332695 4664 scope.go:117] "RemoveContainer" containerID="f836dde346eea3ec0c840d7c59dd6ea0cee61cb2f73f7723d38436417d10e5fc" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.346434 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-swwsk"] Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.355248 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-swwsk"] Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.364820 4664 scope.go:117] "RemoveContainer" containerID="566d012fc3b7aed06eb82996c38f74d151e061fa2074bb49c04a6816f6a035e3" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.366338 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qt6g9" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.373021 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rvmkh" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.455275 4664 scope.go:117] "RemoveContainer" containerID="52ef1ca08d8b6bac463859c0cacd275467a32220128b477d64af1d2756587609" Oct 03 07:51:52 crc kubenswrapper[4664]: E1003 07:51:52.456027 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52ef1ca08d8b6bac463859c0cacd275467a32220128b477d64af1d2756587609\": container with ID starting with 
52ef1ca08d8b6bac463859c0cacd275467a32220128b477d64af1d2756587609 not found: ID does not exist" containerID="52ef1ca08d8b6bac463859c0cacd275467a32220128b477d64af1d2756587609" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.456075 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ef1ca08d8b6bac463859c0cacd275467a32220128b477d64af1d2756587609"} err="failed to get container status \"52ef1ca08d8b6bac463859c0cacd275467a32220128b477d64af1d2756587609\": rpc error: code = NotFound desc = could not find container \"52ef1ca08d8b6bac463859c0cacd275467a32220128b477d64af1d2756587609\": container with ID starting with 52ef1ca08d8b6bac463859c0cacd275467a32220128b477d64af1d2756587609 not found: ID does not exist" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.456134 4664 scope.go:117] "RemoveContainer" containerID="f836dde346eea3ec0c840d7c59dd6ea0cee61cb2f73f7723d38436417d10e5fc" Oct 03 07:51:52 crc kubenswrapper[4664]: E1003 07:51:52.456536 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f836dde346eea3ec0c840d7c59dd6ea0cee61cb2f73f7723d38436417d10e5fc\": container with ID starting with f836dde346eea3ec0c840d7c59dd6ea0cee61cb2f73f7723d38436417d10e5fc not found: ID does not exist" containerID="f836dde346eea3ec0c840d7c59dd6ea0cee61cb2f73f7723d38436417d10e5fc" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.456570 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f836dde346eea3ec0c840d7c59dd6ea0cee61cb2f73f7723d38436417d10e5fc"} err="failed to get container status \"f836dde346eea3ec0c840d7c59dd6ea0cee61cb2f73f7723d38436417d10e5fc\": rpc error: code = NotFound desc = could not find container \"f836dde346eea3ec0c840d7c59dd6ea0cee61cb2f73f7723d38436417d10e5fc\": container with ID starting with f836dde346eea3ec0c840d7c59dd6ea0cee61cb2f73f7723d38436417d10e5fc not found: ID does not exist" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.456595 4664 scope.go:117] "RemoveContainer" containerID="566d012fc3b7aed06eb82996c38f74d151e061fa2074bb49c04a6816f6a035e3" Oct 03 07:51:52 crc kubenswrapper[4664]: E1003 07:51:52.456958 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"566d012fc3b7aed06eb82996c38f74d151e061fa2074bb49c04a6816f6a035e3\": container with ID starting with 566d012fc3b7aed06eb82996c38f74d151e061fa2074bb49c04a6816f6a035e3 not found: ID does not exist" containerID="566d012fc3b7aed06eb82996c38f74d151e061fa2074bb49c04a6816f6a035e3" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.456989 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566d012fc3b7aed06eb82996c38f74d151e061fa2074bb49c04a6816f6a035e3"} err="failed to get container status \"566d012fc3b7aed06eb82996c38f74d151e061fa2074bb49c04a6816f6a035e3\": rpc error: code = NotFound desc = could not find container \"566d012fc3b7aed06eb82996c38f74d151e061fa2074bb49c04a6816f6a035e3\": container with ID starting with 566d012fc3b7aed06eb82996c38f74d151e061fa2074bb49c04a6816f6a035e3 not found: ID does not exist" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.718431 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ftqrw" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.806825 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-catalog-content\") pod \"5d56b337-1eb1-4e79-b6ef-bb2d85737a41\" (UID: \"5d56b337-1eb1-4e79-b6ef-bb2d85737a41\") " Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.806970 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-utilities\") pod \"5d56b337-1eb1-4e79-b6ef-bb2d85737a41\" (UID: \"5d56b337-1eb1-4e79-b6ef-bb2d85737a41\") " Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.807062 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkgg7\" (UniqueName: \"kubernetes.io/projected/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-kube-api-access-vkgg7\") pod \"5d56b337-1eb1-4e79-b6ef-bb2d85737a41\" (UID: \"5d56b337-1eb1-4e79-b6ef-bb2d85737a41\") " Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.807966 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-utilities" (OuterVolumeSpecName: "utilities") pod "5d56b337-1eb1-4e79-b6ef-bb2d85737a41" (UID: "5d56b337-1eb1-4e79-b6ef-bb2d85737a41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.812821 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-kube-api-access-vkgg7" (OuterVolumeSpecName: "kube-api-access-vkgg7") pod "5d56b337-1eb1-4e79-b6ef-bb2d85737a41" (UID: "5d56b337-1eb1-4e79-b6ef-bb2d85737a41"). InnerVolumeSpecName "kube-api-access-vkgg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.858151 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d56b337-1eb1-4e79-b6ef-bb2d85737a41" (UID: "5d56b337-1eb1-4e79-b6ef-bb2d85737a41"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.908960 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.908993 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.909006 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkgg7\" (UniqueName: \"kubernetes.io/projected/5d56b337-1eb1-4e79-b6ef-bb2d85737a41-kube-api-access-vkgg7\") on node \"crc\" DevicePath \"\"" Oct 03 07:51:52 crc kubenswrapper[4664]: I1003 07:51:52.909533 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qt6g9"] Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.319573 4664 generic.go:334] "Generic (PLEG): container finished" podID="5d56b337-1eb1-4e79-b6ef-bb2d85737a41" containerID="4212b2bf969e9c4b3b3e50c91c4264d5c29482a317b9546db694fd0662669046" exitCode=0 Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.319662 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftqrw" Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.319659 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftqrw" event={"ID":"5d56b337-1eb1-4e79-b6ef-bb2d85737a41","Type":"ContainerDied","Data":"4212b2bf969e9c4b3b3e50c91c4264d5c29482a317b9546db694fd0662669046"} Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.320094 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftqrw" event={"ID":"5d56b337-1eb1-4e79-b6ef-bb2d85737a41","Type":"ContainerDied","Data":"eea1b152f46022f0a187f932fe7a593b43617ea73d8e070b731e61a1396d6278"} Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.320114 4664 scope.go:117] "RemoveContainer" containerID="4212b2bf969e9c4b3b3e50c91c4264d5c29482a317b9546db694fd0662669046" Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.347182 4664 scope.go:117] "RemoveContainer" containerID="614cbc1c4d5cfeeb06a9ddce7bb890feb739f9c9dbf1ef89af44e4fb688a84c7" Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.349557 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftqrw"] Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.352317 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ftqrw"] Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.362822 4664 scope.go:117] "RemoveContainer" containerID="132246323cb672d259c14cf20694890722e4463a3702ec734ecdad2583077fc3" Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.380408 4664 scope.go:117] "RemoveContainer" containerID="4212b2bf969e9c4b3b3e50c91c4264d5c29482a317b9546db694fd0662669046" Oct 03 07:51:53 crc kubenswrapper[4664]: E1003 07:51:53.380867 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4212b2bf969e9c4b3b3e50c91c4264d5c29482a317b9546db694fd0662669046\": container with ID starting with 4212b2bf969e9c4b3b3e50c91c4264d5c29482a317b9546db694fd0662669046 not 
found: ID does not exist" containerID="4212b2bf969e9c4b3b3e50c91c4264d5c29482a317b9546db694fd0662669046" Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.380914 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4212b2bf969e9c4b3b3e50c91c4264d5c29482a317b9546db694fd0662669046"} err="failed to get container status \"4212b2bf969e9c4b3b3e50c91c4264d5c29482a317b9546db694fd0662669046\": rpc error: code = NotFound desc = could not find container \"4212b2bf969e9c4b3b3e50c91c4264d5c29482a317b9546db694fd0662669046\": container with ID starting with 4212b2bf969e9c4b3b3e50c91c4264d5c29482a317b9546db694fd0662669046 not found: ID does not exist" Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.380945 4664 scope.go:117] "RemoveContainer" containerID="614cbc1c4d5cfeeb06a9ddce7bb890feb739f9c9dbf1ef89af44e4fb688a84c7" Oct 03 07:51:53 crc kubenswrapper[4664]: E1003 07:51:53.381894 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"614cbc1c4d5cfeeb06a9ddce7bb890feb739f9c9dbf1ef89af44e4fb688a84c7\": container with ID starting with 614cbc1c4d5cfeeb06a9ddce7bb890feb739f9c9dbf1ef89af44e4fb688a84c7 not found: ID does not exist" containerID="614cbc1c4d5cfeeb06a9ddce7bb890feb739f9c9dbf1ef89af44e4fb688a84c7" Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.381922 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614cbc1c4d5cfeeb06a9ddce7bb890feb739f9c9dbf1ef89af44e4fb688a84c7"} err="failed to get container status \"614cbc1c4d5cfeeb06a9ddce7bb890feb739f9c9dbf1ef89af44e4fb688a84c7\": rpc error: code = NotFound desc = could not find container \"614cbc1c4d5cfeeb06a9ddce7bb890feb739f9c9dbf1ef89af44e4fb688a84c7\": container with ID starting with 614cbc1c4d5cfeeb06a9ddce7bb890feb739f9c9dbf1ef89af44e4fb688a84c7 not found: ID does not exist" Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.381940 4664 scope.go:117] "RemoveContainer" containerID="132246323cb672d259c14cf20694890722e4463a3702ec734ecdad2583077fc3" Oct 03 07:51:53 crc kubenswrapper[4664]: E1003 07:51:53.382416 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"132246323cb672d259c14cf20694890722e4463a3702ec734ecdad2583077fc3\": container with ID starting with 132246323cb672d259c14cf20694890722e4463a3702ec734ecdad2583077fc3 not found: ID does not exist" containerID="132246323cb672d259c14cf20694890722e4463a3702ec734ecdad2583077fc3" Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.382445 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"132246323cb672d259c14cf20694890722e4463a3702ec734ecdad2583077fc3"} err="failed to get container status \"132246323cb672d259c14cf20694890722e4463a3702ec734ecdad2583077fc3\": rpc error: code = NotFound desc = could not find container \"132246323cb672d259c14cf20694890722e4463a3702ec734ecdad2583077fc3\": container with ID starting with 132246323cb672d259c14cf20694890722e4463a3702ec734ecdad2583077fc3 not found: ID does not exist" Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.883360 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d56b337-1eb1-4e79-b6ef-bb2d85737a41" path="/var/lib/kubelet/pods/5d56b337-1eb1-4e79-b6ef-bb2d85737a41/volumes" Oct 03 07:51:53 crc kubenswrapper[4664]: I1003 07:51:53.884251 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="78dc4ca9-6ff9-428f-b341-a02f0a85dfec" path="/var/lib/kubelet/pods/78dc4ca9-6ff9-428f-b341-a02f0a85dfec/volumes" Oct 03 07:51:54 crc kubenswrapper[4664]: I1003 07:51:54.328715 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qt6g9" podUID="875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" containerName="registry-server" containerID="cri-o://d5ee632c1c95a8d1bc2d7e2d5763fe6513421030c1f4a02e65d2494b26605606" gracePeriod=2 Oct 03 07:51:55 crc kubenswrapper[4664]: I1003 07:51:55.338161 4664 generic.go:334] "Generic (PLEG): container finished" podID="875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" containerID="d5ee632c1c95a8d1bc2d7e2d5763fe6513421030c1f4a02e65d2494b26605606" exitCode=0 Oct 03 07:51:55 crc kubenswrapper[4664]: I1003 07:51:55.338215 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qt6g9" event={"ID":"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c","Type":"ContainerDied","Data":"d5ee632c1c95a8d1bc2d7e2d5763fe6513421030c1f4a02e65d2494b26605606"} Oct 03 07:51:55 crc kubenswrapper[4664]: I1003 07:51:55.536503 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qt6g9" Oct 03 07:51:55 crc kubenswrapper[4664]: I1003 07:51:55.640928 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-catalog-content\") pod \"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c\" (UID: \"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c\") " Oct 03 07:51:55 crc kubenswrapper[4664]: I1003 07:51:55.641034 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-utilities\") pod \"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c\" (UID: \"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c\") " Oct 03 07:51:55 crc kubenswrapper[4664]: I1003 07:51:55.641107 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbxjg\" (UniqueName: \"kubernetes.io/projected/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-kube-api-access-fbxjg\") pod \"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c\" (UID: \"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c\") " Oct 03 07:51:55 crc kubenswrapper[4664]: I1003 07:51:55.644190 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-utilities" (OuterVolumeSpecName: "utilities") pod "875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" (UID: "875a3a4c-cb52-4fb2-976d-0e795bbfcb4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:51:55 crc kubenswrapper[4664]: I1003 07:51:55.647250 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-kube-api-access-fbxjg" (OuterVolumeSpecName: "kube-api-access-fbxjg") pod "875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" (UID: "875a3a4c-cb52-4fb2-976d-0e795bbfcb4c"). InnerVolumeSpecName "kube-api-access-fbxjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:51:55 crc kubenswrapper[4664]: I1003 07:51:55.742274 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:51:55 crc kubenswrapper[4664]: I1003 07:51:55.742310 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbxjg\" (UniqueName: \"kubernetes.io/projected/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-kube-api-access-fbxjg\") on node \"crc\" DevicePath \"\"" Oct 03 07:51:56 crc kubenswrapper[4664]: I1003 07:51:56.345199 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qt6g9" Oct 03 07:51:56 crc kubenswrapper[4664]: I1003 07:51:56.345218 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qt6g9" event={"ID":"875a3a4c-cb52-4fb2-976d-0e795bbfcb4c","Type":"ContainerDied","Data":"467da52daa29e1b2dd71ba79db197b937b8e6923ea59df26108131d410984ff5"} Oct 03 07:51:56 crc kubenswrapper[4664]: I1003 07:51:56.345379 4664 scope.go:117] "RemoveContainer" containerID="d5ee632c1c95a8d1bc2d7e2d5763fe6513421030c1f4a02e65d2494b26605606" Oct 03 07:51:56 crc kubenswrapper[4664]: I1003 07:51:56.367205 4664 scope.go:117] "RemoveContainer" containerID="17d73ef300e08f93b334db06c1f68447693e3e2b171d10ca55fc4d744d9b2f8b" Oct 03 07:51:56 crc kubenswrapper[4664]: I1003 07:51:56.382174 4664 scope.go:117] "RemoveContainer" containerID="c5606cc81c1a3f77032f74f2702845e639ef6a8c9eeec5a1ce8fe6f1de25de92" Oct 03 07:51:56 crc kubenswrapper[4664]: I1003 07:51:56.776512 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" (UID: "875a3a4c-cb52-4fb2-976d-0e795bbfcb4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:51:56 crc kubenswrapper[4664]: I1003 07:51:56.860353 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:51:56 crc kubenswrapper[4664]: I1003 07:51:56.974143 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qt6g9"] Oct 03 07:51:56 crc kubenswrapper[4664]: I1003 07:51:56.978490 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qt6g9"] Oct 03 07:51:57 crc kubenswrapper[4664]: I1003 07:51:57.883547 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" path="/var/lib/kubelet/pods/875a3a4c-cb52-4fb2-976d-0e795bbfcb4c/volumes" Oct 03 07:51:57 crc kubenswrapper[4664]: I1003 07:51:57.885261 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n4cpb" Oct 03 07:51:58 crc kubenswrapper[4664]: I1003 07:51:58.078735 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2t2bc" Oct 03 07:51:59 crc kubenswrapper[4664]: I1003 07:51:59.956223 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gtxvm" Oct 03 07:52:00 crc kubenswrapper[4664]: I1003 07:52:00.268401 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:52:01 crc kubenswrapper[4664]: I1003 07:52:01.707834 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhpld"] Oct 03 07:52:01 crc kubenswrapper[4664]: I1003 07:52:01.708369 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dhpld" podUID="18497250-f896-4630-98c4-c2915a50fd1b" containerName="registry-server" containerID="cri-o://3d8244b72b4b0e4d55f044846fb054cfa642f33d57d94631ca6380b09a7b670c" gracePeriod=2 Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.061039 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.130724 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18497250-f896-4630-98c4-c2915a50fd1b-utilities\") pod \"18497250-f896-4630-98c4-c2915a50fd1b\" (UID: \"18497250-f896-4630-98c4-c2915a50fd1b\") " Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.130890 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjhfn\" (UniqueName: \"kubernetes.io/projected/18497250-f896-4630-98c4-c2915a50fd1b-kube-api-access-tjhfn\") pod \"18497250-f896-4630-98c4-c2915a50fd1b\" (UID: \"18497250-f896-4630-98c4-c2915a50fd1b\") " Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.130950 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18497250-f896-4630-98c4-c2915a50fd1b-catalog-content\") pod \"18497250-f896-4630-98c4-c2915a50fd1b\" (UID: \"18497250-f896-4630-98c4-c2915a50fd1b\") " Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.131823 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18497250-f896-4630-98c4-c2915a50fd1b-utilities" (OuterVolumeSpecName: "utilities") pod "18497250-f896-4630-98c4-c2915a50fd1b" (UID: "18497250-f896-4630-98c4-c2915a50fd1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.136944 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18497250-f896-4630-98c4-c2915a50fd1b-kube-api-access-tjhfn" (OuterVolumeSpecName: "kube-api-access-tjhfn") pod "18497250-f896-4630-98c4-c2915a50fd1b" (UID: "18497250-f896-4630-98c4-c2915a50fd1b"). InnerVolumeSpecName "kube-api-access-tjhfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.144465 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18497250-f896-4630-98c4-c2915a50fd1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18497250-f896-4630-98c4-c2915a50fd1b" (UID: "18497250-f896-4630-98c4-c2915a50fd1b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.232891 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjhfn\" (UniqueName: \"kubernetes.io/projected/18497250-f896-4630-98c4-c2915a50fd1b-kube-api-access-tjhfn\") on node \"crc\" DevicePath \"\"" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.232924 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18497250-f896-4630-98c4-c2915a50fd1b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.232934 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18497250-f896-4630-98c4-c2915a50fd1b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.385459 4664 generic.go:334] "Generic (PLEG): container finished" podID="18497250-f896-4630-98c4-c2915a50fd1b" containerID="3d8244b72b4b0e4d55f044846fb054cfa642f33d57d94631ca6380b09a7b670c" exitCode=0 Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.385513 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhpld" event={"ID":"18497250-f896-4630-98c4-c2915a50fd1b","Type":"ContainerDied","Data":"3d8244b72b4b0e4d55f044846fb054cfa642f33d57d94631ca6380b09a7b670c"} Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.385545 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhpld" event={"ID":"18497250-f896-4630-98c4-c2915a50fd1b","Type":"ContainerDied","Data":"e867e459054869347c7fed84cc7b077fc1fd8ca274e73a9c345925b2c7ba982c"} Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.385563 4664 scope.go:117] "RemoveContainer" containerID="3d8244b72b4b0e4d55f044846fb054cfa642f33d57d94631ca6380b09a7b670c" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.385709 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhpld" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.404491 4664 scope.go:117] "RemoveContainer" containerID="fbcf5ae0b5c1bb4e0cf908b26053a6a40a6550f97be733a20ee3243a1655dafe" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.414798 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhpld"] Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.417521 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhpld"] Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.443001 4664 scope.go:117] "RemoveContainer" containerID="e5ccadc80da2b8886ced88a94dee899f25bdb2bbd59a3e93c1457a181a03d91f" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.457462 4664 scope.go:117] "RemoveContainer" containerID="3d8244b72b4b0e4d55f044846fb054cfa642f33d57d94631ca6380b09a7b670c" Oct 03 07:52:02 crc kubenswrapper[4664]: E1003 07:52:02.458111 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d8244b72b4b0e4d55f044846fb054cfa642f33d57d94631ca6380b09a7b670c\": container with ID starting with 3d8244b72b4b0e4d55f044846fb054cfa642f33d57d94631ca6380b09a7b670c not found: ID does not exist" containerID="3d8244b72b4b0e4d55f044846fb054cfa642f33d57d94631ca6380b09a7b670c" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.458167 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d8244b72b4b0e4d55f044846fb054cfa642f33d57d94631ca6380b09a7b670c"} err="failed to get container status \"3d8244b72b4b0e4d55f044846fb054cfa642f33d57d94631ca6380b09a7b670c\": rpc error: code = NotFound desc = could not find container \"3d8244b72b4b0e4d55f044846fb054cfa642f33d57d94631ca6380b09a7b670c\": container with ID starting with 3d8244b72b4b0e4d55f044846fb054cfa642f33d57d94631ca6380b09a7b670c not found: ID does not exist" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.458199 4664 scope.go:117] "RemoveContainer" containerID="fbcf5ae0b5c1bb4e0cf908b26053a6a40a6550f97be733a20ee3243a1655dafe" Oct 03 07:52:02 crc kubenswrapper[4664]: E1003 07:52:02.458711 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbcf5ae0b5c1bb4e0cf908b26053a6a40a6550f97be733a20ee3243a1655dafe\": container with ID starting with fbcf5ae0b5c1bb4e0cf908b26053a6a40a6550f97be733a20ee3243a1655dafe not found: ID does not exist" containerID="fbcf5ae0b5c1bb4e0cf908b26053a6a40a6550f97be733a20ee3243a1655dafe" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.458729 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbcf5ae0b5c1bb4e0cf908b26053a6a40a6550f97be733a20ee3243a1655dafe"} err="failed to get container status \"fbcf5ae0b5c1bb4e0cf908b26053a6a40a6550f97be733a20ee3243a1655dafe\": rpc error: code = NotFound desc = could not find container \"fbcf5ae0b5c1bb4e0cf908b26053a6a40a6550f97be733a20ee3243a1655dafe\": container with ID starting with fbcf5ae0b5c1bb4e0cf908b26053a6a40a6550f97be733a20ee3243a1655dafe not found: ID does not exist" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.458744 4664 scope.go:117] "RemoveContainer" containerID="e5ccadc80da2b8886ced88a94dee899f25bdb2bbd59a3e93c1457a181a03d91f" Oct 03 07:52:02 crc kubenswrapper[4664]: E1003 07:52:02.458996 4664 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e5ccadc80da2b8886ced88a94dee899f25bdb2bbd59a3e93c1457a181a03d91f\": container with ID starting with e5ccadc80da2b8886ced88a94dee899f25bdb2bbd59a3e93c1457a181a03d91f not found: ID does not exist" containerID="e5ccadc80da2b8886ced88a94dee899f25bdb2bbd59a3e93c1457a181a03d91f" Oct 03 07:52:02 crc kubenswrapper[4664]: I1003 07:52:02.459013 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5ccadc80da2b8886ced88a94dee899f25bdb2bbd59a3e93c1457a181a03d91f"} err="failed to get container status \"e5ccadc80da2b8886ced88a94dee899f25bdb2bbd59a3e93c1457a181a03d91f\": rpc error: code = NotFound desc = could not find container \"e5ccadc80da2b8886ced88a94dee899f25bdb2bbd59a3e93c1457a181a03d91f\": container with ID starting with e5ccadc80da2b8886ced88a94dee899f25bdb2bbd59a3e93c1457a181a03d91f not found: ID does not exist" Oct 03 07:52:03 crc kubenswrapper[4664]: I1003 07:52:03.883392 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18497250-f896-4630-98c4-c2915a50fd1b" path="/var/lib/kubelet/pods/18497250-f896-4630-98c4-c2915a50fd1b/volumes" Oct 03 07:52:10 crc kubenswrapper[4664]: I1003 07:52:10.745839 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h46lz"] Oct 03 07:52:11 crc kubenswrapper[4664]: I1003 07:52:11.987205 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:52:11 crc kubenswrapper[4664]: I1003 07:52:11.987262 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:52:11 crc kubenswrapper[4664]: I1003 07:52:11.987311 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:52:11 crc kubenswrapper[4664]: I1003 07:52:11.987927 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:52:11 crc kubenswrapper[4664]: I1003 07:52:11.987974 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427" gracePeriod=600 Oct 03 07:52:12 crc kubenswrapper[4664]: I1003 07:52:12.435708 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427" exitCode=0 Oct 03 07:52:12 crc kubenswrapper[4664]: I1003 07:52:12.436063 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427"} Oct 03 07:52:12 crc kubenswrapper[4664]: I1003 07:52:12.436094 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"67bcc74759e35616c9e66561e791cb4182af2c1e5e890cdf15bef9f1f05e339f"} Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578157 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-k5wnw"] Oct 03 07:52:13 crc kubenswrapper[4664]: E1003 07:52:13.578696 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d05775-5de6-4f48-bfa5-96df1a0a8aa3" containerName="pruner" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578708 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d05775-5de6-4f48-bfa5-96df1a0a8aa3" containerName="pruner" Oct 03 07:52:13 crc kubenswrapper[4664]: E1003 07:52:13.578719 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d56b337-1eb1-4e79-b6ef-bb2d85737a41" containerName="registry-server" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578725 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d56b337-1eb1-4e79-b6ef-bb2d85737a41" containerName="registry-server" Oct 03 07:52:13 crc kubenswrapper[4664]: E1003 07:52:13.578732 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78dc4ca9-6ff9-428f-b341-a02f0a85dfec" containerName="registry-server" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578739 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="78dc4ca9-6ff9-428f-b341-a02f0a85dfec" containerName="registry-server" Oct 03 07:52:13 crc kubenswrapper[4664]: E1003 07:52:13.578768 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" containerName="extract-utilities" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578776 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" containerName="extract-utilities" Oct 03 07:52:13 crc kubenswrapper[4664]: E1003 07:52:13.578787 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18497250-f896-4630-98c4-c2915a50fd1b" containerName="extract-content" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578795 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="18497250-f896-4630-98c4-c2915a50fd1b" containerName="extract-content" Oct 03 07:52:13 crc kubenswrapper[4664]: E1003 07:52:13.578807 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" containerName="extract-content" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578831 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" containerName="extract-content" Oct 03 07:52:13 crc kubenswrapper[4664]: E1003 07:52:13.578845 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18497250-f896-4630-98c4-c2915a50fd1b" containerName="extract-utilities" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578851 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="18497250-f896-4630-98c4-c2915a50fd1b" containerName="extract-utilities" Oct 03 07:52:13 crc kubenswrapper[4664]: E1003 
07:52:13.578861 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" containerName="registry-server" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578867 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" containerName="registry-server" Oct 03 07:52:13 crc kubenswrapper[4664]: E1003 07:52:13.578875 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91" containerName="collect-profiles" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578880 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91" containerName="collect-profiles" Oct 03 07:52:13 crc kubenswrapper[4664]: E1003 07:52:13.578887 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d52b7963-c3dd-44d1-b33f-14b98a2ef0f2" containerName="pruner" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578893 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52b7963-c3dd-44d1-b33f-14b98a2ef0f2" containerName="pruner" Oct 03 07:52:13 crc kubenswrapper[4664]: E1003 07:52:13.578900 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d56b337-1eb1-4e79-b6ef-bb2d85737a41" containerName="extract-content" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578906 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d56b337-1eb1-4e79-b6ef-bb2d85737a41" containerName="extract-content" Oct 03 07:52:13 crc kubenswrapper[4664]: E1003 07:52:13.578916 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78dc4ca9-6ff9-428f-b341-a02f0a85dfec" containerName="extract-content" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578921 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="78dc4ca9-6ff9-428f-b341-a02f0a85dfec" containerName="extract-content" Oct 03 07:52:13 crc kubenswrapper[4664]: E1003 07:52:13.578930 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d56b337-1eb1-4e79-b6ef-bb2d85737a41" containerName="extract-utilities" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578936 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d56b337-1eb1-4e79-b6ef-bb2d85737a41" containerName="extract-utilities" Oct 03 07:52:13 crc kubenswrapper[4664]: E1003 07:52:13.578944 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78dc4ca9-6ff9-428f-b341-a02f0a85dfec" containerName="extract-utilities" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578949 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="78dc4ca9-6ff9-428f-b341-a02f0a85dfec" containerName="extract-utilities" Oct 03 07:52:13 crc kubenswrapper[4664]: E1003 07:52:13.578957 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18497250-f896-4630-98c4-c2915a50fd1b" containerName="registry-server" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.578963 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="18497250-f896-4630-98c4-c2915a50fd1b" containerName="registry-server" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.579050 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="d52b7963-c3dd-44d1-b33f-14b98a2ef0f2" containerName="pruner" Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.579060 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d05775-5de6-4f48-bfa5-96df1a0a8aa3" containerName="pruner" Oct 03 07:52:13 crc kubenswrapper[4664]: 
I1003 07:52:13.579067 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91" containerName="collect-profiles"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.579080 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="875a3a4c-cb52-4fb2-976d-0e795bbfcb4c" containerName="registry-server"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.579089 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="18497250-f896-4630-98c4-c2915a50fd1b" containerName="registry-server"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.579099 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="78dc4ca9-6ff9-428f-b341-a02f0a85dfec" containerName="registry-server"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.579109 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d56b337-1eb1-4e79-b6ef-bb2d85737a41" containerName="registry-server"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.579482 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.596462 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-k5wnw"]
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.680338 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/56e896e4-7a34-4857-9f05-068e492a4960-ca-trust-extracted\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.680397 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/56e896e4-7a34-4857-9f05-068e492a4960-installation-pull-secrets\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.680442 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56e896e4-7a34-4857-9f05-068e492a4960-bound-sa-token\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.680484 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/56e896e4-7a34-4857-9f05-068e492a4960-registry-certificates\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.680673 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/56e896e4-7a34-4857-9f05-068e492a4960-registry-tls\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.680727 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.680780 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56e896e4-7a34-4857-9f05-068e492a4960-trusted-ca\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.680865 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pcmb\" (UniqueName: \"kubernetes.io/projected/56e896e4-7a34-4857-9f05-068e492a4960-kube-api-access-7pcmb\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.704231 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.781959 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/56e896e4-7a34-4857-9f05-068e492a4960-ca-trust-extracted\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.782494 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/56e896e4-7a34-4857-9f05-068e492a4960-installation-pull-secrets\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.782637 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56e896e4-7a34-4857-9f05-068e492a4960-bound-sa-token\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.782880 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/56e896e4-7a34-4857-9f05-068e492a4960-registry-certificates\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.783007 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/56e896e4-7a34-4857-9f05-068e492a4960-registry-tls\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.782449 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/56e896e4-7a34-4857-9f05-068e492a4960-ca-trust-extracted\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.783252 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56e896e4-7a34-4857-9f05-068e492a4960-trusted-ca\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.783383 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pcmb\" (UniqueName: \"kubernetes.io/projected/56e896e4-7a34-4857-9f05-068e492a4960-kube-api-access-7pcmb\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.784550 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/56e896e4-7a34-4857-9f05-068e492a4960-registry-certificates\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.784920 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56e896e4-7a34-4857-9f05-068e492a4960-trusted-ca\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.790395 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/56e896e4-7a34-4857-9f05-068e492a4960-installation-pull-secrets\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.791022 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/56e896e4-7a34-4857-9f05-068e492a4960-registry-tls\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.799922 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56e896e4-7a34-4857-9f05-068e492a4960-bound-sa-token\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.802029 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pcmb\" (UniqueName: \"kubernetes.io/projected/56e896e4-7a34-4857-9f05-068e492a4960-kube-api-access-7pcmb\") pod \"image-registry-66df7c8f76-k5wnw\" (UID: \"56e896e4-7a34-4857-9f05-068e492a4960\") " pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:13 crc kubenswrapper[4664]: I1003 07:52:13.895709 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:14 crc kubenswrapper[4664]: I1003 07:52:14.312695 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-k5wnw"]
Oct 03 07:52:14 crc kubenswrapper[4664]: I1003 07:52:14.448214 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw" event={"ID":"56e896e4-7a34-4857-9f05-068e492a4960","Type":"ContainerStarted","Data":"49df259bfca2d8edbaafbc6c59ecc674acb34c6291694dbb6f6c2106ddbbb996"}
Oct 03 07:52:14 crc kubenswrapper[4664]: I1003 07:52:14.448264 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw" event={"ID":"56e896e4-7a34-4857-9f05-068e492a4960","Type":"ContainerStarted","Data":"3eddf18c10e4f6fcdd31bd54b109d9decd6e1b04b7281ac1d539dbcb1ffeaa62"}
Oct 03 07:52:14 crc kubenswrapper[4664]: I1003 07:52:14.449174 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:33 crc kubenswrapper[4664]: I1003 07:52:33.899670 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw"
Oct 03 07:52:33 crc kubenswrapper[4664]: I1003 07:52:33.917558 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-k5wnw" podStartSLOduration=20.917544629 podStartE2EDuration="20.917544629s" podCreationTimestamp="2025-10-03 07:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:52:14.484232605 +0000 UTC m=+235.305423115" watchObservedRunningTime="2025-10-03 07:52:33.917544629 +0000 UTC m=+254.738735119"
Oct 03 07:52:33 crc kubenswrapper[4664]: I1003 07:52:33.946200 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-szr58"]
Oct 03 07:52:35 crc kubenswrapper[4664]: I1003 07:52:35.781350 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" podUID="cf947105-e97d-4a1c-9b59-bf6b37461c1e" containerName="oauth-openshift" containerID="cri-o://5c9e2acb46f4810eaf773f4cd064187a66df4475f0cdad646fdd494e5a5711d5" gracePeriod=15
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.172084 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h46lz"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.208040 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"]
Oct 03 07:52:36 crc kubenswrapper[4664]: E1003 07:52:36.208406 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf947105-e97d-4a1c-9b59-bf6b37461c1e" containerName="oauth-openshift"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.208431 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf947105-e97d-4a1c-9b59-bf6b37461c1e" containerName="oauth-openshift"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.208564 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf947105-e97d-4a1c-9b59-bf6b37461c1e" containerName="oauth-openshift"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.211268 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.220231 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"]
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.317207 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-login\") pod \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") "
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.317264 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-cliconfig\") pod \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") "
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.317326 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-trusted-ca-bundle\") pod \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") "
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.317370 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-error\") pod \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") "
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.317403 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-serving-cert\") pod \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") "
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.317427 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-idp-0-file-data\") pod \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") "
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.317658 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf947105-e97d-4a1c-9b59-bf6b37461c1e-audit-dir\") pod \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") "
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.317683 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-router-certs\") pod \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") "
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.317708 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-service-ca\") pod \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") "
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.317700 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf947105-e97d-4a1c-9b59-bf6b37461c1e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cf947105-e97d-4a1c-9b59-bf6b37461c1e" (UID: "cf947105-e97d-4a1c-9b59-bf6b37461c1e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.318383 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6npk\" (UniqueName: \"kubernetes.io/projected/cf947105-e97d-4a1c-9b59-bf6b37461c1e-kube-api-access-w6npk\") pod \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") "
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.318461 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "cf947105-e97d-4a1c-9b59-bf6b37461c1e" (UID: "cf947105-e97d-4a1c-9b59-bf6b37461c1e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.318845 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "cf947105-e97d-4a1c-9b59-bf6b37461c1e" (UID: "cf947105-e97d-4a1c-9b59-bf6b37461c1e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.319063 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "cf947105-e97d-4a1c-9b59-bf6b37461c1e" (UID: "cf947105-e97d-4a1c-9b59-bf6b37461c1e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.320016 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-session\") pod \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") "
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.320071 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-provider-selection\") pod \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") "
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.320095 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-audit-policies\") pod \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") "
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.320151 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-ocp-branding-template\") pod \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\" (UID: \"cf947105-e97d-4a1c-9b59-bf6b37461c1e\") "
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.320375 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-audit-policies\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.320457 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-session\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.320681 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-user-template-login\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.320763 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frqvj\" (UniqueName: \"kubernetes.io/projected/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-kube-api-access-frqvj\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321229 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-user-template-error\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321162 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "cf947105-e97d-4a1c-9b59-bf6b37461c1e" (UID: "cf947105-e97d-4a1c-9b59-bf6b37461c1e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321258 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321362 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321410 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321447 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-service-ca\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321475 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-router-certs\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321504 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321535 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321577 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321628 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-audit-dir\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321739 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321772 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321795 4664 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf947105-e97d-4a1c-9b59-bf6b37461c1e-audit-dir\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321807 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.321817 4664 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf947105-e97d-4a1c-9b59-bf6b37461c1e-audit-policies\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.324352 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "cf947105-e97d-4a1c-9b59-bf6b37461c1e" (UID: "cf947105-e97d-4a1c-9b59-bf6b37461c1e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.325117 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "cf947105-e97d-4a1c-9b59-bf6b37461c1e" (UID: "cf947105-e97d-4a1c-9b59-bf6b37461c1e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.325823 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf947105-e97d-4a1c-9b59-bf6b37461c1e-kube-api-access-w6npk" (OuterVolumeSpecName: "kube-api-access-w6npk") pod "cf947105-e97d-4a1c-9b59-bf6b37461c1e" (UID: "cf947105-e97d-4a1c-9b59-bf6b37461c1e"). InnerVolumeSpecName "kube-api-access-w6npk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.325834 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "cf947105-e97d-4a1c-9b59-bf6b37461c1e" (UID: "cf947105-e97d-4a1c-9b59-bf6b37461c1e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.326217 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "cf947105-e97d-4a1c-9b59-bf6b37461c1e" (UID: "cf947105-e97d-4a1c-9b59-bf6b37461c1e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.326261 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "cf947105-e97d-4a1c-9b59-bf6b37461c1e" (UID: "cf947105-e97d-4a1c-9b59-bf6b37461c1e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.326590 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "cf947105-e97d-4a1c-9b59-bf6b37461c1e" (UID: "cf947105-e97d-4a1c-9b59-bf6b37461c1e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.328050 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "cf947105-e97d-4a1c-9b59-bf6b37461c1e" (UID: "cf947105-e97d-4a1c-9b59-bf6b37461c1e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.328430 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "cf947105-e97d-4a1c-9b59-bf6b37461c1e" (UID: "cf947105-e97d-4a1c-9b59-bf6b37461c1e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423155 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-audit-dir\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423239 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-audit-policies\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423272 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-session\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423293 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-user-template-login\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423330 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frqvj\" (UniqueName: \"kubernetes.io/projected/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-kube-api-access-frqvj\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423314 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-audit-dir\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423359 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-user-template-error\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423480 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423519 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423552 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423573 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-service-ca\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423594 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-router-certs\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423770 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423792 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423815 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423863 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423877 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423891 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423906 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423921 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423934 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6npk\" (UniqueName: \"kubernetes.io/projected/cf947105-e97d-4a1c-9b59-bf6b37461c1e-kube-api-access-w6npk\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423948 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423961 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.423977 4664 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf947105-e97d-4a1c-9b59-bf6b37461c1e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.424434 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-audit-policies\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.425360 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.426290 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.426969 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-service-ca\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.427513 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-session\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.427721 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.427757 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-user-template-error\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.428231 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.428662 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.428822 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.429594 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-system-router-certs\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.429843 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-v4-0-config-user-template-login\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.441406 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frqvj\" (UniqueName: \"kubernetes.io/projected/ed4919fc-1df3-4328-b8d5-d17bdbd202f7-kube-api-access-frqvj\") pod \"oauth-openshift-5dcd86cbbd-59n9d\" (UID: \"ed4919fc-1df3-4328-b8d5-d17bdbd202f7\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.535927 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.565559 4664 generic.go:334] "Generic (PLEG): container finished" podID="cf947105-e97d-4a1c-9b59-bf6b37461c1e" containerID="5c9e2acb46f4810eaf773f4cd064187a66df4475f0cdad646fdd494e5a5711d5" exitCode=0
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.565649 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" event={"ID":"cf947105-e97d-4a1c-9b59-bf6b37461c1e","Type":"ContainerDied","Data":"5c9e2acb46f4810eaf773f4cd064187a66df4475f0cdad646fdd494e5a5711d5"}
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.565694 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h46lz" event={"ID":"cf947105-e97d-4a1c-9b59-bf6b37461c1e","Type":"ContainerDied","Data":"3ebb4f216ef38315c507dba210b0bd4c9426b9e7c02e4ee9007d37313e2d1919"}
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.565717 4664 scope.go:117] "RemoveContainer" containerID="5c9e2acb46f4810eaf773f4cd064187a66df4475f0cdad646fdd494e5a5711d5"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.565912 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h46lz"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.589674 4664 scope.go:117] "RemoveContainer" containerID="5c9e2acb46f4810eaf773f4cd064187a66df4475f0cdad646fdd494e5a5711d5"
Oct 03 07:52:36 crc kubenswrapper[4664]: E1003 07:52:36.590374 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c9e2acb46f4810eaf773f4cd064187a66df4475f0cdad646fdd494e5a5711d5\": container with ID starting with 5c9e2acb46f4810eaf773f4cd064187a66df4475f0cdad646fdd494e5a5711d5 not found: ID does not exist" containerID="5c9e2acb46f4810eaf773f4cd064187a66df4475f0cdad646fdd494e5a5711d5"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.590410 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c9e2acb46f4810eaf773f4cd064187a66df4475f0cdad646fdd494e5a5711d5"} err="failed to get container status \"5c9e2acb46f4810eaf773f4cd064187a66df4475f0cdad646fdd494e5a5711d5\": rpc error: code = NotFound desc = could not find container \"5c9e2acb46f4810eaf773f4cd064187a66df4475f0cdad646fdd494e5a5711d5\": container with ID starting with 5c9e2acb46f4810eaf773f4cd064187a66df4475f0cdad646fdd494e5a5711d5 not found: ID does not exist"
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.610508 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h46lz"]
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.614162 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h46lz"]
Oct 03 07:52:36 crc kubenswrapper[4664]: I1003 07:52:36.964636 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"]
Oct 03 07:52:37 crc kubenswrapper[4664]: I1003 07:52:37.572136 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d" event={"ID":"ed4919fc-1df3-4328-b8d5-d17bdbd202f7","Type":"ContainerStarted","Data":"205b4a849361988f9fb0943a927de993770b5197921161d4f951d9195dddecdd"}
Oct 03 07:52:37 crc kubenswrapper[4664]: I1003 07:52:37.572484 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d" event={"ID":"ed4919fc-1df3-4328-b8d5-d17bdbd202f7","Type":"ContainerStarted","Data":"0311742bb3408a6c11d463432f38b20157262816aa847779c47b3d3bac1dccd8"}
Oct 03 07:52:37 crc kubenswrapper[4664]: I1003 07:52:37.572511 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:37 crc kubenswrapper[4664]: I1003 07:52:37.596920 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d" podStartSLOduration=27.596897405 podStartE2EDuration="27.596897405s" podCreationTimestamp="2025-10-03 07:52:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:52:37.595473579 +0000 UTC m=+258.416664089" watchObservedRunningTime="2025-10-03 07:52:37.596897405 +0000 UTC m=+258.418087895"
Oct 03 07:52:37 crc kubenswrapper[4664]: I1003 07:52:37.780836 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5dcd86cbbd-59n9d"
Oct 03 07:52:37 crc kubenswrapper[4664]: I1003 07:52:37.884352 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf947105-e97d-4a1c-9b59-bf6b37461c1e" path="/var/lib/kubelet/pods/cf947105-e97d-4a1c-9b59-bf6b37461c1e/volumes"
Oct 03 07:52:58 crc kubenswrapper[4664]: I1003 07:52:58.982691 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-szr58" podUID="478f27ac-050e-4086-82c9-2e23559cf70b" containerName="registry" containerID="cri-o://bf0b9811e718078c691368635b828a300e6559d943528edc44e70aab1c4f9ba4" gracePeriod=30
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.359509 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.482698 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"478f27ac-050e-4086-82c9-2e23559cf70b\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") "
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.482761 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-registry-tls\") pod \"478f27ac-050e-4086-82c9-2e23559cf70b\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") "
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.482842 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/478f27ac-050e-4086-82c9-2e23559cf70b-ca-trust-extracted\") pod \"478f27ac-050e-4086-82c9-2e23559cf70b\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") "
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.482891 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/478f27ac-050e-4086-82c9-2e23559cf70b-installation-pull-secrets\") pod \"478f27ac-050e-4086-82c9-2e23559cf70b\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") "
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.482952 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/478f27ac-050e-4086-82c9-2e23559cf70b-registry-certificates\") pod \"478f27ac-050e-4086-82c9-2e23559cf70b\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") "
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.482979 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-bound-sa-token\") pod \"478f27ac-050e-4086-82c9-2e23559cf70b\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") "
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.483004 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsnrt\" (UniqueName: \"kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-kube-api-access-fsnrt\") pod \"478f27ac-050e-4086-82c9-2e23559cf70b\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") "
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.483055 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/478f27ac-050e-4086-82c9-2e23559cf70b-trusted-ca\") pod \"478f27ac-050e-4086-82c9-2e23559cf70b\" (UID: \"478f27ac-050e-4086-82c9-2e23559cf70b\") "
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.484011 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478f27ac-050e-4086-82c9-2e23559cf70b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "478f27ac-050e-4086-82c9-2e23559cf70b" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.484113 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478f27ac-050e-4086-82c9-2e23559cf70b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "478f27ac-050e-4086-82c9-2e23559cf70b" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.489854 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "478f27ac-050e-4086-82c9-2e23559cf70b" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.490713 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478f27ac-050e-4086-82c9-2e23559cf70b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "478f27ac-050e-4086-82c9-2e23559cf70b" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.494593 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "478f27ac-050e-4086-82c9-2e23559cf70b" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.498737 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "478f27ac-050e-4086-82c9-2e23559cf70b" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.499696 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-kube-api-access-fsnrt" (OuterVolumeSpecName: "kube-api-access-fsnrt") pod "478f27ac-050e-4086-82c9-2e23559cf70b" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b"). InnerVolumeSpecName "kube-api-access-fsnrt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.500649 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/478f27ac-050e-4086-82c9-2e23559cf70b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "478f27ac-050e-4086-82c9-2e23559cf70b" (UID: "478f27ac-050e-4086-82c9-2e23559cf70b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.584785 4664 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/478f27ac-050e-4086-82c9-2e23559cf70b-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.584842 4664 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-registry-tls\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.584858 4664 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/478f27ac-050e-4086-82c9-2e23559cf70b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.584871 4664 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/478f27ac-050e-4086-82c9-2e23559cf70b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.584884 4664 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/478f27ac-050e-4086-82c9-2e23559cf70b-registry-certificates\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.584895 4664 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.584906 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsnrt\" (UniqueName: \"kubernetes.io/projected/478f27ac-050e-4086-82c9-2e23559cf70b-kube-api-access-fsnrt\") on node \"crc\" DevicePath \"\""
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.697292 4664 generic.go:334] "Generic (PLEG): container finished" podID="478f27ac-050e-4086-82c9-2e23559cf70b" containerID="bf0b9811e718078c691368635b828a300e6559d943528edc44e70aab1c4f9ba4" exitCode=0
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.697368 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-szr58" event={"ID":"478f27ac-050e-4086-82c9-2e23559cf70b","Type":"ContainerDied","Data":"bf0b9811e718078c691368635b828a300e6559d943528edc44e70aab1c4f9ba4"}
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.697421 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-szr58"
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.697469 4664 scope.go:117] "RemoveContainer" containerID="bf0b9811e718078c691368635b828a300e6559d943528edc44e70aab1c4f9ba4"
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.697446 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-szr58" event={"ID":"478f27ac-050e-4086-82c9-2e23559cf70b","Type":"ContainerDied","Data":"778d2bcdb0801b4aaeca5861384f6171396b6cbb1e709bc89918b8f4533d0e0c"}
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.714075 4664 scope.go:117] "RemoveContainer" containerID="bf0b9811e718078c691368635b828a300e6559d943528edc44e70aab1c4f9ba4"
Oct 03 07:52:59 crc kubenswrapper[4664]: E1003 07:52:59.714568 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf0b9811e718078c691368635b828a300e6559d943528edc44e70aab1c4f9ba4\": container with ID starting with bf0b9811e718078c691368635b828a300e6559d943528edc44e70aab1c4f9ba4 not found: ID does not exist" containerID="bf0b9811e718078c691368635b828a300e6559d943528edc44e70aab1c4f9ba4"
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.714631 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf0b9811e718078c691368635b828a300e6559d943528edc44e70aab1c4f9ba4"} err="failed to get container status \"bf0b9811e718078c691368635b828a300e6559d943528edc44e70aab1c4f9ba4\": rpc error: code = NotFound desc = could not find container \"bf0b9811e718078c691368635b828a300e6559d943528edc44e70aab1c4f9ba4\": container with ID starting with bf0b9811e718078c691368635b828a300e6559d943528edc44e70aab1c4f9ba4 not found: ID does not exist"
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.724923 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-szr58"]
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.742475 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-szr58"]
Oct 03 07:52:59 crc kubenswrapper[4664]: I1003 07:52:59.883629 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="478f27ac-050e-4086-82c9-2e23559cf70b" path="/var/lib/kubelet/pods/478f27ac-050e-4086-82c9-2e23559cf70b/volumes"
Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.474713 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2t2bc"]
Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.476059 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2t2bc" podUID="5c238baa-b35f-404b-b6e3-ebec940e30be" containerName="registry-server" containerID="cri-o://47e075c99e8cb43fd41f83ca04ac2ab2317548da290bb0374f603bc8d2b82130" gracePeriod=30
Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.480901 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n4cpb"]
Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.481254 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n4cpb" podUID="ea079e38-0970-4e57-af62-4910892ea04d" containerName="registry-server" containerID="cri-o://1ecf1bc8ddb7df3a2e5880d5ae0f1041a7702b1c88649b4085a79b4eab9989f2" gracePeriod=30
Oct 03 07:53:35
crc kubenswrapper[4664]: I1003 07:53:35.506974 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gj7qw"] Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.507349 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" podUID="4efd784b-b02d-4298-a96b-ed5663641afa" containerName="marketplace-operator" containerID="cri-o://0e9a92668969e66d232e906c266e4572dfc1ca353999125f1588251a903ce3e8" gracePeriod=30 Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.510351 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gtxvm"] Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.510708 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gtxvm" podUID="583c3690-dbf8-4272-bb35-b5557b7a3e74" containerName="registry-server" containerID="cri-o://27dd716a1f125c3785120296322d9108be131c2470b085d701985b1c8d2d00b7" gracePeriod=30 Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.519655 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gwncn"] Oct 03 07:53:35 crc kubenswrapper[4664]: E1003 07:53:35.519978 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478f27ac-050e-4086-82c9-2e23559cf70b" containerName="registry" Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.520000 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="478f27ac-050e-4086-82c9-2e23559cf70b" containerName="registry" Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.520128 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="478f27ac-050e-4086-82c9-2e23559cf70b" containerName="registry" Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.520688 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.524157 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rvmkh"] Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.524396 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rvmkh" podUID="b855f400-75f9-44f7-9da3-1b4a850ac090" containerName="registry-server" containerID="cri-o://58006fba632e0bf618bc78808f16fcb633942aca32c33b55d32489b5cfa6e9c3" gracePeriod=30 Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.527269 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gwncn"] Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.671031 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d87572d6-1577-487c-a43f-e99ea9b20724-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gwncn\" (UID: \"d87572d6-1577-487c-a43f-e99ea9b20724\") " pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.671121 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d87572d6-1577-487c-a43f-e99ea9b20724-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gwncn\" (UID: \"d87572d6-1577-487c-a43f-e99ea9b20724\") " pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.671236 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ptnw\" (UniqueName: \"kubernetes.io/projected/d87572d6-1577-487c-a43f-e99ea9b20724-kube-api-access-4ptnw\") pod \"marketplace-operator-79b997595-gwncn\" (UID: \"d87572d6-1577-487c-a43f-e99ea9b20724\") " pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.772791 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d87572d6-1577-487c-a43f-e99ea9b20724-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gwncn\" (UID: \"d87572d6-1577-487c-a43f-e99ea9b20724\") " pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.772896 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ptnw\" (UniqueName: \"kubernetes.io/projected/d87572d6-1577-487c-a43f-e99ea9b20724-kube-api-access-4ptnw\") pod \"marketplace-operator-79b997595-gwncn\" (UID: \"d87572d6-1577-487c-a43f-e99ea9b20724\") " pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.772931 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d87572d6-1577-487c-a43f-e99ea9b20724-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gwncn\" (UID: \"d87572d6-1577-487c-a43f-e99ea9b20724\") " pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.774168 4664 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d87572d6-1577-487c-a43f-e99ea9b20724-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gwncn\" (UID: \"d87572d6-1577-487c-a43f-e99ea9b20724\") " pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.778450 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d87572d6-1577-487c-a43f-e99ea9b20724-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gwncn\" (UID: \"d87572d6-1577-487c-a43f-e99ea9b20724\") " pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.793992 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ptnw\" (UniqueName: \"kubernetes.io/projected/d87572d6-1577-487c-a43f-e99ea9b20724-kube-api-access-4ptnw\") pod \"marketplace-operator-79b997595-gwncn\" (UID: \"d87572d6-1577-487c-a43f-e99ea9b20724\") " pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.840351 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.900210 4664 generic.go:334] "Generic (PLEG): container finished" podID="b855f400-75f9-44f7-9da3-1b4a850ac090" containerID="58006fba632e0bf618bc78808f16fcb633942aca32c33b55d32489b5cfa6e9c3" exitCode=0 Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.900237 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rvmkh" event={"ID":"b855f400-75f9-44f7-9da3-1b4a850ac090","Type":"ContainerDied","Data":"58006fba632e0bf618bc78808f16fcb633942aca32c33b55d32489b5cfa6e9c3"} Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.903267 4664 generic.go:334] "Generic (PLEG): container finished" podID="5c238baa-b35f-404b-b6e3-ebec940e30be" containerID="47e075c99e8cb43fd41f83ca04ac2ab2317548da290bb0374f603bc8d2b82130" exitCode=0 Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.903336 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2t2bc" event={"ID":"5c238baa-b35f-404b-b6e3-ebec940e30be","Type":"ContainerDied","Data":"47e075c99e8cb43fd41f83ca04ac2ab2317548da290bb0374f603bc8d2b82130"} Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.905079 4664 generic.go:334] "Generic (PLEG): container finished" podID="4efd784b-b02d-4298-a96b-ed5663641afa" containerID="0e9a92668969e66d232e906c266e4572dfc1ca353999125f1588251a903ce3e8" exitCode=0 Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.905159 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" event={"ID":"4efd784b-b02d-4298-a96b-ed5663641afa","Type":"ContainerDied","Data":"0e9a92668969e66d232e906c266e4572dfc1ca353999125f1588251a903ce3e8"} Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.907714 4664 generic.go:334] "Generic (PLEG): container finished" podID="583c3690-dbf8-4272-bb35-b5557b7a3e74" containerID="27dd716a1f125c3785120296322d9108be131c2470b085d701985b1c8d2d00b7" exitCode=0 Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.907795 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-gtxvm" event={"ID":"583c3690-dbf8-4272-bb35-b5557b7a3e74","Type":"ContainerDied","Data":"27dd716a1f125c3785120296322d9108be131c2470b085d701985b1c8d2d00b7"} Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.922520 4664 generic.go:334] "Generic (PLEG): container finished" podID="ea079e38-0970-4e57-af62-4910892ea04d" containerID="1ecf1bc8ddb7df3a2e5880d5ae0f1041a7702b1c88649b4085a79b4eab9989f2" exitCode=0 Oct 03 07:53:35 crc kubenswrapper[4664]: I1003 07:53:35.922591 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4cpb" event={"ID":"ea079e38-0970-4e57-af62-4910892ea04d","Type":"ContainerDied","Data":"1ecf1bc8ddb7df3a2e5880d5ae0f1041a7702b1c88649b4085a79b4eab9989f2"} Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.231403 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gwncn"] Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.450128 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2t2bc" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.572753 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n4cpb" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.585644 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4m48\" (UniqueName: \"kubernetes.io/projected/5c238baa-b35f-404b-b6e3-ebec940e30be-kube-api-access-b4m48\") pod \"5c238baa-b35f-404b-b6e3-ebec940e30be\" (UID: \"5c238baa-b35f-404b-b6e3-ebec940e30be\") " Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.585759 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c238baa-b35f-404b-b6e3-ebec940e30be-catalog-content\") pod \"5c238baa-b35f-404b-b6e3-ebec940e30be\" (UID: \"5c238baa-b35f-404b-b6e3-ebec940e30be\") " Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.585825 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c238baa-b35f-404b-b6e3-ebec940e30be-utilities\") pod \"5c238baa-b35f-404b-b6e3-ebec940e30be\" (UID: \"5c238baa-b35f-404b-b6e3-ebec940e30be\") " Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.587986 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c238baa-b35f-404b-b6e3-ebec940e30be-utilities" (OuterVolumeSpecName: "utilities") pod "5c238baa-b35f-404b-b6e3-ebec940e30be" (UID: "5c238baa-b35f-404b-b6e3-ebec940e30be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.604961 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c238baa-b35f-404b-b6e3-ebec940e30be-kube-api-access-b4m48" (OuterVolumeSpecName: "kube-api-access-b4m48") pod "5c238baa-b35f-404b-b6e3-ebec940e30be" (UID: "5c238baa-b35f-404b-b6e3-ebec940e30be"). InnerVolumeSpecName "kube-api-access-b4m48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.660684 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c238baa-b35f-404b-b6e3-ebec940e30be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c238baa-b35f-404b-b6e3-ebec940e30be" (UID: "5c238baa-b35f-404b-b6e3-ebec940e30be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.687190 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea079e38-0970-4e57-af62-4910892ea04d-utilities\") pod \"ea079e38-0970-4e57-af62-4910892ea04d\" (UID: \"ea079e38-0970-4e57-af62-4910892ea04d\") " Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.687238 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea079e38-0970-4e57-af62-4910892ea04d-catalog-content\") pod \"ea079e38-0970-4e57-af62-4910892ea04d\" (UID: \"ea079e38-0970-4e57-af62-4910892ea04d\") " Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.687341 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcm94\" (UniqueName: \"kubernetes.io/projected/ea079e38-0970-4e57-af62-4910892ea04d-kube-api-access-jcm94\") pod \"ea079e38-0970-4e57-af62-4910892ea04d\" (UID: \"ea079e38-0970-4e57-af62-4910892ea04d\") " Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.689563 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea079e38-0970-4e57-af62-4910892ea04d-utilities" (OuterVolumeSpecName: "utilities") pod "ea079e38-0970-4e57-af62-4910892ea04d" (UID: "ea079e38-0970-4e57-af62-4910892ea04d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.693535 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea079e38-0970-4e57-af62-4910892ea04d-kube-api-access-jcm94" (OuterVolumeSpecName: "kube-api-access-jcm94") pod "ea079e38-0970-4e57-af62-4910892ea04d" (UID: "ea079e38-0970-4e57-af62-4910892ea04d"). InnerVolumeSpecName "kube-api-access-jcm94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.696410 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea079e38-0970-4e57-af62-4910892ea04d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.696452 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4m48\" (UniqueName: \"kubernetes.io/projected/5c238baa-b35f-404b-b6e3-ebec940e30be-kube-api-access-b4m48\") on node \"crc\" DevicePath \"\"" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.696465 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcm94\" (UniqueName: \"kubernetes.io/projected/ea079e38-0970-4e57-af62-4910892ea04d-kube-api-access-jcm94\") on node \"crc\" DevicePath \"\"" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.696475 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c238baa-b35f-404b-b6e3-ebec940e30be-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.696483 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c238baa-b35f-404b-b6e3-ebec940e30be-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.717902 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rvmkh" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.723963 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.736569 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gtxvm" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.742453 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea079e38-0970-4e57-af62-4910892ea04d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea079e38-0970-4e57-af62-4910892ea04d" (UID: "ea079e38-0970-4e57-af62-4910892ea04d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.797753 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea079e38-0970-4e57-af62-4910892ea04d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.899152 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b855f400-75f9-44f7-9da3-1b4a850ac090-catalog-content\") pod \"b855f400-75f9-44f7-9da3-1b4a850ac090\" (UID: \"b855f400-75f9-44f7-9da3-1b4a850ac090\") " Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.899229 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583c3690-dbf8-4272-bb35-b5557b7a3e74-utilities\") pod \"583c3690-dbf8-4272-bb35-b5557b7a3e74\" (UID: \"583c3690-dbf8-4272-bb35-b5557b7a3e74\") " Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.899255 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh7gc\" (UniqueName: \"kubernetes.io/projected/583c3690-dbf8-4272-bb35-b5557b7a3e74-kube-api-access-zh7gc\") pod \"583c3690-dbf8-4272-bb35-b5557b7a3e74\" (UID: \"583c3690-dbf8-4272-bb35-b5557b7a3e74\") " Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.899301 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjwjr\" (UniqueName: \"kubernetes.io/projected/4efd784b-b02d-4298-a96b-ed5663641afa-kube-api-access-mjwjr\") pod \"4efd784b-b02d-4298-a96b-ed5663641afa\" (UID: \"4efd784b-b02d-4298-a96b-ed5663641afa\") " Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.899350 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583c3690-dbf8-4272-bb35-b5557b7a3e74-catalog-content\") pod \"583c3690-dbf8-4272-bb35-b5557b7a3e74\" (UID: \"583c3690-dbf8-4272-bb35-b5557b7a3e74\") " Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.899398 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnbfr\" (UniqueName: \"kubernetes.io/projected/b855f400-75f9-44f7-9da3-1b4a850ac090-kube-api-access-qnbfr\") pod \"b855f400-75f9-44f7-9da3-1b4a850ac090\" (UID: \"b855f400-75f9-44f7-9da3-1b4a850ac090\") " Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.899432 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-operator-metrics\") pod \"4efd784b-b02d-4298-a96b-ed5663641afa\" (UID: \"4efd784b-b02d-4298-a96b-ed5663641afa\") " Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.899459 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-trusted-ca\") pod \"4efd784b-b02d-4298-a96b-ed5663641afa\" (UID: \"4efd784b-b02d-4298-a96b-ed5663641afa\") " Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.899475 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b855f400-75f9-44f7-9da3-1b4a850ac090-utilities\") pod \"b855f400-75f9-44f7-9da3-1b4a850ac090\" (UID: 
\"b855f400-75f9-44f7-9da3-1b4a850ac090\") " Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.900480 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b855f400-75f9-44f7-9da3-1b4a850ac090-utilities" (OuterVolumeSpecName: "utilities") pod "b855f400-75f9-44f7-9da3-1b4a850ac090" (UID: "b855f400-75f9-44f7-9da3-1b4a850ac090"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.901212 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/583c3690-dbf8-4272-bb35-b5557b7a3e74-utilities" (OuterVolumeSpecName: "utilities") pod "583c3690-dbf8-4272-bb35-b5557b7a3e74" (UID: "583c3690-dbf8-4272-bb35-b5557b7a3e74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.904198 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b855f400-75f9-44f7-9da3-1b4a850ac090-kube-api-access-qnbfr" (OuterVolumeSpecName: "kube-api-access-qnbfr") pod "b855f400-75f9-44f7-9da3-1b4a850ac090" (UID: "b855f400-75f9-44f7-9da3-1b4a850ac090"). InnerVolumeSpecName "kube-api-access-qnbfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.908279 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/583c3690-dbf8-4272-bb35-b5557b7a3e74-kube-api-access-zh7gc" (OuterVolumeSpecName: "kube-api-access-zh7gc") pod "583c3690-dbf8-4272-bb35-b5557b7a3e74" (UID: "583c3690-dbf8-4272-bb35-b5557b7a3e74"). InnerVolumeSpecName "kube-api-access-zh7gc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.908347 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "4efd784b-b02d-4298-a96b-ed5663641afa" (UID: "4efd784b-b02d-4298-a96b-ed5663641afa"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.910237 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4efd784b-b02d-4298-a96b-ed5663641afa-kube-api-access-mjwjr" (OuterVolumeSpecName: "kube-api-access-mjwjr") pod "4efd784b-b02d-4298-a96b-ed5663641afa" (UID: "4efd784b-b02d-4298-a96b-ed5663641afa"). InnerVolumeSpecName "kube-api-access-mjwjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.910273 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "4efd784b-b02d-4298-a96b-ed5663641afa" (UID: "4efd784b-b02d-4298-a96b-ed5663641afa"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.917900 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/583c3690-dbf8-4272-bb35-b5557b7a3e74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "583c3690-dbf8-4272-bb35-b5557b7a3e74" (UID: "583c3690-dbf8-4272-bb35-b5557b7a3e74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.928892 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rvmkh" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.928960 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rvmkh" event={"ID":"b855f400-75f9-44f7-9da3-1b4a850ac090","Type":"ContainerDied","Data":"9f44bf72963849d735930f05db476a9c0207df8f26361baaa1690c872302be58"} Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.929007 4664 scope.go:117] "RemoveContainer" containerID="58006fba632e0bf618bc78808f16fcb633942aca32c33b55d32489b5cfa6e9c3" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.931257 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2t2bc" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.931229 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2t2bc" event={"ID":"5c238baa-b35f-404b-b6e3-ebec940e30be","Type":"ContainerDied","Data":"407c6a0a92fd9e8c562d093d20e0749be92f57771ea2404294790fc4894a7adf"} Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.932932 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" event={"ID":"4efd784b-b02d-4298-a96b-ed5663641afa","Type":"ContainerDied","Data":"a4dcd594943948e7c9116c5878c9c29d03d0bffb47f432a37d6138483d4ea04d"} Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.932944 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gj7qw" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.936120 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtxvm" event={"ID":"583c3690-dbf8-4272-bb35-b5557b7a3e74","Type":"ContainerDied","Data":"a33fd0bebbb47d9df35316a4e49fb9bd9236cd12954709b0208c6a7e384293e1"} Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.936202 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gtxvm" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.943463 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4cpb" event={"ID":"ea079e38-0970-4e57-af62-4910892ea04d","Type":"ContainerDied","Data":"8e4fbdc4b3415f1af4bca9861846e19cf703a9a4390a6f44b8ea5e624c26e12b"} Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.943821 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n4cpb" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.945954 4664 scope.go:117] "RemoveContainer" containerID="09af6798fe0e74acd7e15256ad0a0f97c7bc9f7679278439751f7bae1494d955" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.946419 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" event={"ID":"d87572d6-1577-487c-a43f-e99ea9b20724","Type":"ContainerStarted","Data":"4bc96e1b538323037acc56b38fe59f2cea82e97983c55a1def03d2ddcc7e331c"} Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.946452 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" event={"ID":"d87572d6-1577-487c-a43f-e99ea9b20724","Type":"ContainerStarted","Data":"f4b3b577140cbea3b7fc9da78f7df95fa953a8a790229ae5bbb8c9178020aae1"} Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.947438 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.954945 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.976236 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gwncn" podStartSLOduration=1.9762136799999999 podStartE2EDuration="1.97621368s" podCreationTimestamp="2025-10-03 07:53:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:53:36.972693421 +0000 UTC m=+317.793883931" watchObservedRunningTime="2025-10-03 07:53:36.97621368 +0000 UTC m=+317.797404170" Oct 03 07:53:36 crc kubenswrapper[4664]: I1003 07:53:36.991300 4664 scope.go:117] "RemoveContainer" containerID="b62562a28a58fabb9a5b01f537214d33128d05ad09af16996ffca85b151636d5" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.002515 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh7gc\" (UniqueName: \"kubernetes.io/projected/583c3690-dbf8-4272-bb35-b5557b7a3e74-kube-api-access-zh7gc\") on node \"crc\" DevicePath \"\"" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.002553 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjwjr\" (UniqueName: \"kubernetes.io/projected/4efd784b-b02d-4298-a96b-ed5663641afa-kube-api-access-mjwjr\") on node \"crc\" DevicePath \"\"" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.002564 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583c3690-dbf8-4272-bb35-b5557b7a3e74-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.002575 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnbfr\" (UniqueName: \"kubernetes.io/projected/b855f400-75f9-44f7-9da3-1b4a850ac090-kube-api-access-qnbfr\") on node \"crc\" DevicePath \"\"" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.002587 4664 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 
07:53:37.002599 4664 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4efd784b-b02d-4298-a96b-ed5663641afa-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.002638 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b855f400-75f9-44f7-9da3-1b4a850ac090-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.002651 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583c3690-dbf8-4272-bb35-b5557b7a3e74-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.014397 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gj7qw"] Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.024171 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gj7qw"] Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.034290 4664 scope.go:117] "RemoveContainer" containerID="47e075c99e8cb43fd41f83ca04ac2ab2317548da290bb0374f603bc8d2b82130" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.048489 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gtxvm"] Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.054273 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b855f400-75f9-44f7-9da3-1b4a850ac090-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b855f400-75f9-44f7-9da3-1b4a850ac090" (UID: "b855f400-75f9-44f7-9da3-1b4a850ac090"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.062654 4664 scope.go:117] "RemoveContainer" containerID="3f327c4c349eaf2f06061e3674b98765fa8604fdb5f0889cb36c04cad955e6f5" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.064230 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gtxvm"] Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.067531 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2t2bc"] Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.070672 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2t2bc"] Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.073189 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n4cpb"] Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.075413 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n4cpb"] Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.076514 4664 scope.go:117] "RemoveContainer" containerID="fc5f5f61d2852dba7006aea91cc5d03bbf7b7df35f5b4aba10cbccb8ec21bcf4" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.088912 4664 scope.go:117] "RemoveContainer" containerID="0e9a92668969e66d232e906c266e4572dfc1ca353999125f1588251a903ce3e8" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.099594 4664 scope.go:117] "RemoveContainer" containerID="27dd716a1f125c3785120296322d9108be131c2470b085d701985b1c8d2d00b7" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.104867 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b855f400-75f9-44f7-9da3-1b4a850ac090-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.111072 4664 scope.go:117] "RemoveContainer" containerID="5f9bd733440713c370f0f03c6b638cb3bdbf57b2f7c529be624663a75bf7d830" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.123951 4664 scope.go:117] "RemoveContainer" containerID="e9e355b6077feadbe86518ec74bafe3c4547c5c7215287c7e918864fc301967d" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.138065 4664 scope.go:117] "RemoveContainer" containerID="1ecf1bc8ddb7df3a2e5880d5ae0f1041a7702b1c88649b4085a79b4eab9989f2" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.152368 4664 scope.go:117] "RemoveContainer" containerID="19eeb749c947f19a3a372484ff897b2591519ebcc04afc0ca2b4f331bc4bd138" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.167064 4664 scope.go:117] "RemoveContainer" containerID="e49f502612bef94c12e697bc00f79db50f7b907f80534b51ad099ce75963af10" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.264917 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rvmkh"] Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.267302 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rvmkh"] Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.669568 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pvwj7"] Oct 03 07:53:37 crc kubenswrapper[4664]: E1003 07:53:37.670045 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b855f400-75f9-44f7-9da3-1b4a850ac090" containerName="extract-utilities" Oct 03 07:53:37 crc 
kubenswrapper[4664]: I1003 07:53:37.670057 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b855f400-75f9-44f7-9da3-1b4a850ac090" containerName="extract-utilities" Oct 03 07:53:37 crc kubenswrapper[4664]: E1003 07:53:37.670070 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea079e38-0970-4e57-af62-4910892ea04d" containerName="extract-content" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670076 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea079e38-0970-4e57-af62-4910892ea04d" containerName="extract-content" Oct 03 07:53:37 crc kubenswrapper[4664]: E1003 07:53:37.670087 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583c3690-dbf8-4272-bb35-b5557b7a3e74" containerName="registry-server" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670092 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="583c3690-dbf8-4272-bb35-b5557b7a3e74" containerName="registry-server" Oct 03 07:53:37 crc kubenswrapper[4664]: E1003 07:53:37.670100 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b855f400-75f9-44f7-9da3-1b4a850ac090" containerName="extract-content" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670106 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b855f400-75f9-44f7-9da3-1b4a850ac090" containerName="extract-content" Oct 03 07:53:37 crc kubenswrapper[4664]: E1003 07:53:37.670115 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b855f400-75f9-44f7-9da3-1b4a850ac090" containerName="registry-server" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670122 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b855f400-75f9-44f7-9da3-1b4a850ac090" containerName="registry-server" Oct 03 07:53:37 crc kubenswrapper[4664]: E1003 07:53:37.670131 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efd784b-b02d-4298-a96b-ed5663641afa" containerName="marketplace-operator" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670138 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efd784b-b02d-4298-a96b-ed5663641afa" containerName="marketplace-operator" Oct 03 07:53:37 crc kubenswrapper[4664]: E1003 07:53:37.670147 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583c3690-dbf8-4272-bb35-b5557b7a3e74" containerName="extract-utilities" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670153 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="583c3690-dbf8-4272-bb35-b5557b7a3e74" containerName="extract-utilities" Oct 03 07:53:37 crc kubenswrapper[4664]: E1003 07:53:37.670162 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c238baa-b35f-404b-b6e3-ebec940e30be" containerName="extract-content" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670167 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c238baa-b35f-404b-b6e3-ebec940e30be" containerName="extract-content" Oct 03 07:53:37 crc kubenswrapper[4664]: E1003 07:53:37.670175 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583c3690-dbf8-4272-bb35-b5557b7a3e74" containerName="extract-content" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670181 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="583c3690-dbf8-4272-bb35-b5557b7a3e74" containerName="extract-content" Oct 03 07:53:37 crc kubenswrapper[4664]: E1003 07:53:37.670191 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea079e38-0970-4e57-af62-4910892ea04d" 
containerName="extract-utilities" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670196 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea079e38-0970-4e57-af62-4910892ea04d" containerName="extract-utilities" Oct 03 07:53:37 crc kubenswrapper[4664]: E1003 07:53:37.670204 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c238baa-b35f-404b-b6e3-ebec940e30be" containerName="registry-server" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670209 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c238baa-b35f-404b-b6e3-ebec940e30be" containerName="registry-server" Oct 03 07:53:37 crc kubenswrapper[4664]: E1003 07:53:37.670216 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c238baa-b35f-404b-b6e3-ebec940e30be" containerName="extract-utilities" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670222 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c238baa-b35f-404b-b6e3-ebec940e30be" containerName="extract-utilities" Oct 03 07:53:37 crc kubenswrapper[4664]: E1003 07:53:37.670229 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea079e38-0970-4e57-af62-4910892ea04d" containerName="registry-server" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670235 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea079e38-0970-4e57-af62-4910892ea04d" containerName="registry-server" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670320 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="4efd784b-b02d-4298-a96b-ed5663641afa" containerName="marketplace-operator" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670332 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea079e38-0970-4e57-af62-4910892ea04d" containerName="registry-server" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670341 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="b855f400-75f9-44f7-9da3-1b4a850ac090" containerName="registry-server" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670350 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="583c3690-dbf8-4272-bb35-b5557b7a3e74" containerName="registry-server" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.670359 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c238baa-b35f-404b-b6e3-ebec940e30be" containerName="registry-server" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.671189 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvwj7" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.673096 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.686342 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvwj7"] Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.814504 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abe4747-28b2-446f-afd5-b0e736c90d03-utilities\") pod \"redhat-marketplace-pvwj7\" (UID: \"3abe4747-28b2-446f-afd5-b0e736c90d03\") " pod="openshift-marketplace/redhat-marketplace-pvwj7" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.814551 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxsxl\" (UniqueName: \"kubernetes.io/projected/3abe4747-28b2-446f-afd5-b0e736c90d03-kube-api-access-hxsxl\") pod \"redhat-marketplace-pvwj7\" (UID: \"3abe4747-28b2-446f-afd5-b0e736c90d03\") " pod="openshift-marketplace/redhat-marketplace-pvwj7" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.814598 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abe4747-28b2-446f-afd5-b0e736c90d03-catalog-content\") pod \"redhat-marketplace-pvwj7\" (UID: \"3abe4747-28b2-446f-afd5-b0e736c90d03\") " pod="openshift-marketplace/redhat-marketplace-pvwj7" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.871988 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dxhhl"] Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.873302 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dxhhl" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.875993 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.884026 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4efd784b-b02d-4298-a96b-ed5663641afa" path="/var/lib/kubelet/pods/4efd784b-b02d-4298-a96b-ed5663641afa/volumes" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.884488 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="583c3690-dbf8-4272-bb35-b5557b7a3e74" path="/var/lib/kubelet/pods/583c3690-dbf8-4272-bb35-b5557b7a3e74/volumes" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.885052 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c238baa-b35f-404b-b6e3-ebec940e30be" path="/var/lib/kubelet/pods/5c238baa-b35f-404b-b6e3-ebec940e30be/volumes" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.886068 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b855f400-75f9-44f7-9da3-1b4a850ac090" path="/var/lib/kubelet/pods/b855f400-75f9-44f7-9da3-1b4a850ac090/volumes" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.887248 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea079e38-0970-4e57-af62-4910892ea04d" path="/var/lib/kubelet/pods/ea079e38-0970-4e57-af62-4910892ea04d/volumes" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.888244 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dxhhl"] Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.915556 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abe4747-28b2-446f-afd5-b0e736c90d03-catalog-content\") pod \"redhat-marketplace-pvwj7\" (UID: \"3abe4747-28b2-446f-afd5-b0e736c90d03\") " pod="openshift-marketplace/redhat-marketplace-pvwj7" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.915642 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abe4747-28b2-446f-afd5-b0e736c90d03-utilities\") pod \"redhat-marketplace-pvwj7\" (UID: \"3abe4747-28b2-446f-afd5-b0e736c90d03\") " pod="openshift-marketplace/redhat-marketplace-pvwj7" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.915665 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxsxl\" (UniqueName: \"kubernetes.io/projected/3abe4747-28b2-446f-afd5-b0e736c90d03-kube-api-access-hxsxl\") pod \"redhat-marketplace-pvwj7\" (UID: \"3abe4747-28b2-446f-afd5-b0e736c90d03\") " pod="openshift-marketplace/redhat-marketplace-pvwj7" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.916180 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abe4747-28b2-446f-afd5-b0e736c90d03-utilities\") pod \"redhat-marketplace-pvwj7\" (UID: \"3abe4747-28b2-446f-afd5-b0e736c90d03\") " pod="openshift-marketplace/redhat-marketplace-pvwj7" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.916195 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abe4747-28b2-446f-afd5-b0e736c90d03-catalog-content\") pod \"redhat-marketplace-pvwj7\" (UID: 
\"3abe4747-28b2-446f-afd5-b0e736c90d03\") " pod="openshift-marketplace/redhat-marketplace-pvwj7" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.933152 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxsxl\" (UniqueName: \"kubernetes.io/projected/3abe4747-28b2-446f-afd5-b0e736c90d03-kube-api-access-hxsxl\") pod \"redhat-marketplace-pvwj7\" (UID: \"3abe4747-28b2-446f-afd5-b0e736c90d03\") " pod="openshift-marketplace/redhat-marketplace-pvwj7" Oct 03 07:53:37 crc kubenswrapper[4664]: I1003 07:53:37.987450 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvwj7" Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.016977 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2p2h\" (UniqueName: \"kubernetes.io/projected/61835213-5c1e-47e2-88ed-453c167e750d-kube-api-access-p2p2h\") pod \"redhat-operators-dxhhl\" (UID: \"61835213-5c1e-47e2-88ed-453c167e750d\") " pod="openshift-marketplace/redhat-operators-dxhhl" Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.018994 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61835213-5c1e-47e2-88ed-453c167e750d-utilities\") pod \"redhat-operators-dxhhl\" (UID: \"61835213-5c1e-47e2-88ed-453c167e750d\") " pod="openshift-marketplace/redhat-operators-dxhhl" Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.019072 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61835213-5c1e-47e2-88ed-453c167e750d-catalog-content\") pod \"redhat-operators-dxhhl\" (UID: \"61835213-5c1e-47e2-88ed-453c167e750d\") " pod="openshift-marketplace/redhat-operators-dxhhl" Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.120486 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61835213-5c1e-47e2-88ed-453c167e750d-catalog-content\") pod \"redhat-operators-dxhhl\" (UID: \"61835213-5c1e-47e2-88ed-453c167e750d\") " pod="openshift-marketplace/redhat-operators-dxhhl" Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.120888 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2p2h\" (UniqueName: \"kubernetes.io/projected/61835213-5c1e-47e2-88ed-453c167e750d-kube-api-access-p2p2h\") pod \"redhat-operators-dxhhl\" (UID: \"61835213-5c1e-47e2-88ed-453c167e750d\") " pod="openshift-marketplace/redhat-operators-dxhhl" Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.120912 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61835213-5c1e-47e2-88ed-453c167e750d-utilities\") pod \"redhat-operators-dxhhl\" (UID: \"61835213-5c1e-47e2-88ed-453c167e750d\") " pod="openshift-marketplace/redhat-operators-dxhhl" Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.120979 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61835213-5c1e-47e2-88ed-453c167e750d-catalog-content\") pod \"redhat-operators-dxhhl\" (UID: \"61835213-5c1e-47e2-88ed-453c167e750d\") " pod="openshift-marketplace/redhat-operators-dxhhl" Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.121235 4664 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61835213-5c1e-47e2-88ed-453c167e750d-utilities\") pod \"redhat-operators-dxhhl\" (UID: \"61835213-5c1e-47e2-88ed-453c167e750d\") " pod="openshift-marketplace/redhat-operators-dxhhl" Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.139640 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2p2h\" (UniqueName: \"kubernetes.io/projected/61835213-5c1e-47e2-88ed-453c167e750d-kube-api-access-p2p2h\") pod \"redhat-operators-dxhhl\" (UID: \"61835213-5c1e-47e2-88ed-453c167e750d\") " pod="openshift-marketplace/redhat-operators-dxhhl" Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.152372 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvwj7"] Oct 03 07:53:38 crc kubenswrapper[4664]: W1003 07:53:38.161264 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3abe4747_28b2_446f_afd5_b0e736c90d03.slice/crio-b0a9ef3867053c81dd728c6f257fd35e1d6a90c2edc97ab5a334b73ade5622fa WatchSource:0}: Error finding container b0a9ef3867053c81dd728c6f257fd35e1d6a90c2edc97ab5a334b73ade5622fa: Status 404 returned error can't find the container with id b0a9ef3867053c81dd728c6f257fd35e1d6a90c2edc97ab5a334b73ade5622fa Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.200005 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dxhhl" Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.385065 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dxhhl"] Oct 03 07:53:38 crc kubenswrapper[4664]: W1003 07:53:38.448729 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61835213_5c1e_47e2_88ed_453c167e750d.slice/crio-26ee705b6c8945c16658232b228b797f6cca37be033a982297e4a902122a5bc3 WatchSource:0}: Error finding container 26ee705b6c8945c16658232b228b797f6cca37be033a982297e4a902122a5bc3: Status 404 returned error can't find the container with id 26ee705b6c8945c16658232b228b797f6cca37be033a982297e4a902122a5bc3 Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.961736 4664 generic.go:334] "Generic (PLEG): container finished" podID="61835213-5c1e-47e2-88ed-453c167e750d" containerID="07e9f3e17bea6022aacce7bc107e1a6e6a75810ad7e42ff2921c24dc43b603bd" exitCode=0 Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.961810 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxhhl" event={"ID":"61835213-5c1e-47e2-88ed-453c167e750d","Type":"ContainerDied","Data":"07e9f3e17bea6022aacce7bc107e1a6e6a75810ad7e42ff2921c24dc43b603bd"} Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.961839 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxhhl" event={"ID":"61835213-5c1e-47e2-88ed-453c167e750d","Type":"ContainerStarted","Data":"26ee705b6c8945c16658232b228b797f6cca37be033a982297e4a902122a5bc3"} Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.963340 4664 generic.go:334] "Generic (PLEG): container finished" podID="3abe4747-28b2-446f-afd5-b0e736c90d03" containerID="5a4fcdfa2dc98b1cb15fa28fea83f1e13f5a947bf08fef7bef85fda05c9c50e8" exitCode=0 Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.963404 4664 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvwj7" event={"ID":"3abe4747-28b2-446f-afd5-b0e736c90d03","Type":"ContainerDied","Data":"5a4fcdfa2dc98b1cb15fa28fea83f1e13f5a947bf08fef7bef85fda05c9c50e8"} Oct 03 07:53:38 crc kubenswrapper[4664]: I1003 07:53:38.963426 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvwj7" event={"ID":"3abe4747-28b2-446f-afd5-b0e736c90d03","Type":"ContainerStarted","Data":"b0a9ef3867053c81dd728c6f257fd35e1d6a90c2edc97ab5a334b73ade5622fa"} Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.078809 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-27n9z"] Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.084122 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-27n9z" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.089383 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.092941 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-27n9z"] Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.247759 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfffz\" (UniqueName: \"kubernetes.io/projected/5844f220-7a74-41b4-9b03-eee894a66f32-kube-api-access-nfffz\") pod \"certified-operators-27n9z\" (UID: \"5844f220-7a74-41b4-9b03-eee894a66f32\") " pod="openshift-marketplace/certified-operators-27n9z" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.247878 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5844f220-7a74-41b4-9b03-eee894a66f32-catalog-content\") pod \"certified-operators-27n9z\" (UID: \"5844f220-7a74-41b4-9b03-eee894a66f32\") " pod="openshift-marketplace/certified-operators-27n9z" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.247905 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5844f220-7a74-41b4-9b03-eee894a66f32-utilities\") pod \"certified-operators-27n9z\" (UID: \"5844f220-7a74-41b4-9b03-eee894a66f32\") " pod="openshift-marketplace/certified-operators-27n9z" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.273330 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zr8nh"] Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.274483 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zr8nh" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.276941 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.285633 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zr8nh"] Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.348580 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5844f220-7a74-41b4-9b03-eee894a66f32-utilities\") pod \"certified-operators-27n9z\" (UID: \"5844f220-7a74-41b4-9b03-eee894a66f32\") " pod="openshift-marketplace/certified-operators-27n9z" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.348676 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfffz\" (UniqueName: \"kubernetes.io/projected/5844f220-7a74-41b4-9b03-eee894a66f32-kube-api-access-nfffz\") pod \"certified-operators-27n9z\" (UID: \"5844f220-7a74-41b4-9b03-eee894a66f32\") " pod="openshift-marketplace/certified-operators-27n9z" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.348718 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5844f220-7a74-41b4-9b03-eee894a66f32-catalog-content\") pod \"certified-operators-27n9z\" (UID: \"5844f220-7a74-41b4-9b03-eee894a66f32\") " pod="openshift-marketplace/certified-operators-27n9z" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.349116 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5844f220-7a74-41b4-9b03-eee894a66f32-catalog-content\") pod \"certified-operators-27n9z\" (UID: \"5844f220-7a74-41b4-9b03-eee894a66f32\") " pod="openshift-marketplace/certified-operators-27n9z" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.349217 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5844f220-7a74-41b4-9b03-eee894a66f32-utilities\") pod \"certified-operators-27n9z\" (UID: \"5844f220-7a74-41b4-9b03-eee894a66f32\") " pod="openshift-marketplace/certified-operators-27n9z" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.369187 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfffz\" (UniqueName: \"kubernetes.io/projected/5844f220-7a74-41b4-9b03-eee894a66f32-kube-api-access-nfffz\") pod \"certified-operators-27n9z\" (UID: \"5844f220-7a74-41b4-9b03-eee894a66f32\") " pod="openshift-marketplace/certified-operators-27n9z" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.403093 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-27n9z" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.449859 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjf4l\" (UniqueName: \"kubernetes.io/projected/40c95b7d-be3e-4613-b795-f5d636b12ce4-kube-api-access-hjf4l\") pod \"community-operators-zr8nh\" (UID: \"40c95b7d-be3e-4613-b795-f5d636b12ce4\") " pod="openshift-marketplace/community-operators-zr8nh" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.449979 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40c95b7d-be3e-4613-b795-f5d636b12ce4-utilities\") pod \"community-operators-zr8nh\" (UID: \"40c95b7d-be3e-4613-b795-f5d636b12ce4\") " pod="openshift-marketplace/community-operators-zr8nh" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.450044 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40c95b7d-be3e-4613-b795-f5d636b12ce4-catalog-content\") pod \"community-operators-zr8nh\" (UID: \"40c95b7d-be3e-4613-b795-f5d636b12ce4\") " pod="openshift-marketplace/community-operators-zr8nh" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.551793 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjf4l\" (UniqueName: \"kubernetes.io/projected/40c95b7d-be3e-4613-b795-f5d636b12ce4-kube-api-access-hjf4l\") pod \"community-operators-zr8nh\" (UID: \"40c95b7d-be3e-4613-b795-f5d636b12ce4\") " pod="openshift-marketplace/community-operators-zr8nh" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.551852 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40c95b7d-be3e-4613-b795-f5d636b12ce4-utilities\") pod \"community-operators-zr8nh\" (UID: \"40c95b7d-be3e-4613-b795-f5d636b12ce4\") " pod="openshift-marketplace/community-operators-zr8nh" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.551901 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40c95b7d-be3e-4613-b795-f5d636b12ce4-catalog-content\") pod \"community-operators-zr8nh\" (UID: \"40c95b7d-be3e-4613-b795-f5d636b12ce4\") " pod="openshift-marketplace/community-operators-zr8nh" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.553096 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40c95b7d-be3e-4613-b795-f5d636b12ce4-catalog-content\") pod \"community-operators-zr8nh\" (UID: \"40c95b7d-be3e-4613-b795-f5d636b12ce4\") " pod="openshift-marketplace/community-operators-zr8nh" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.553126 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40c95b7d-be3e-4613-b795-f5d636b12ce4-utilities\") pod \"community-operators-zr8nh\" (UID: \"40c95b7d-be3e-4613-b795-f5d636b12ce4\") " pod="openshift-marketplace/community-operators-zr8nh" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.574535 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjf4l\" (UniqueName: \"kubernetes.io/projected/40c95b7d-be3e-4613-b795-f5d636b12ce4-kube-api-access-hjf4l\") pod 
\"community-operators-zr8nh\" (UID: \"40c95b7d-be3e-4613-b795-f5d636b12ce4\") " pod="openshift-marketplace/community-operators-zr8nh" Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.598310 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-27n9z"] Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.605329 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zr8nh" Oct 03 07:53:40 crc kubenswrapper[4664]: W1003 07:53:40.607538 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5844f220_7a74_41b4_9b03_eee894a66f32.slice/crio-2f07cc80e8ebf1a84530263c181f38612ead86c5e2d6bc06dd0289ffc41d788e WatchSource:0}: Error finding container 2f07cc80e8ebf1a84530263c181f38612ead86c5e2d6bc06dd0289ffc41d788e: Status 404 returned error can't find the container with id 2f07cc80e8ebf1a84530263c181f38612ead86c5e2d6bc06dd0289ffc41d788e Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.805916 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zr8nh"] Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.975309 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxhhl" event={"ID":"61835213-5c1e-47e2-88ed-453c167e750d","Type":"ContainerStarted","Data":"5dc42e37f28b32171202d75578de698fefb569d9c7a5d2463b81f02f57109d07"} Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.976903 4664 generic.go:334] "Generic (PLEG): container finished" podID="5844f220-7a74-41b4-9b03-eee894a66f32" containerID="b9c2c1734bf2cdd6005ce365be697d573586cfd9fb193cdcc214ba8dc0ef8eca" exitCode=0 Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.976967 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27n9z" event={"ID":"5844f220-7a74-41b4-9b03-eee894a66f32","Type":"ContainerDied","Data":"b9c2c1734bf2cdd6005ce365be697d573586cfd9fb193cdcc214ba8dc0ef8eca"} Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.976989 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27n9z" event={"ID":"5844f220-7a74-41b4-9b03-eee894a66f32","Type":"ContainerStarted","Data":"2f07cc80e8ebf1a84530263c181f38612ead86c5e2d6bc06dd0289ffc41d788e"} Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.978715 4664 generic.go:334] "Generic (PLEG): container finished" podID="3abe4747-28b2-446f-afd5-b0e736c90d03" containerID="bf0a79f1561a6c7d6263173edd28b1bb33c42f624205d219df38f255e5a1a38b" exitCode=0 Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.978806 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvwj7" event={"ID":"3abe4747-28b2-446f-afd5-b0e736c90d03","Type":"ContainerDied","Data":"bf0a79f1561a6c7d6263173edd28b1bb33c42f624205d219df38f255e5a1a38b"} Oct 03 07:53:40 crc kubenswrapper[4664]: I1003 07:53:40.979761 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr8nh" event={"ID":"40c95b7d-be3e-4613-b795-f5d636b12ce4","Type":"ContainerStarted","Data":"436d7e3d93761d0608ac3f96001e37dd9636b14db68b10f34834898a019172ab"} Oct 03 07:53:41 crc kubenswrapper[4664]: I1003 07:53:41.988035 4664 generic.go:334] "Generic (PLEG): container finished" podID="61835213-5c1e-47e2-88ed-453c167e750d" 
containerID="5dc42e37f28b32171202d75578de698fefb569d9c7a5d2463b81f02f57109d07" exitCode=0 Oct 03 07:53:41 crc kubenswrapper[4664]: I1003 07:53:41.988144 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxhhl" event={"ID":"61835213-5c1e-47e2-88ed-453c167e750d","Type":"ContainerDied","Data":"5dc42e37f28b32171202d75578de698fefb569d9c7a5d2463b81f02f57109d07"} Oct 03 07:53:41 crc kubenswrapper[4664]: I1003 07:53:41.990263 4664 generic.go:334] "Generic (PLEG): container finished" podID="40c95b7d-be3e-4613-b795-f5d636b12ce4" containerID="83388f83612b239f9e1c2d77a9a22ddb080260b1599486772900c36057c7ea34" exitCode=0 Oct 03 07:53:41 crc kubenswrapper[4664]: I1003 07:53:41.990322 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr8nh" event={"ID":"40c95b7d-be3e-4613-b795-f5d636b12ce4","Type":"ContainerDied","Data":"83388f83612b239f9e1c2d77a9a22ddb080260b1599486772900c36057c7ea34"} Oct 03 07:53:43 crc kubenswrapper[4664]: I1003 07:53:43.000366 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27n9z" event={"ID":"5844f220-7a74-41b4-9b03-eee894a66f32","Type":"ContainerStarted","Data":"27dccc9053b5087b570cf81a9be202ba88453afca8d5b6ca28307c723a784642"} Oct 03 07:53:43 crc kubenswrapper[4664]: I1003 07:53:43.005235 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvwj7" event={"ID":"3abe4747-28b2-446f-afd5-b0e736c90d03","Type":"ContainerStarted","Data":"1f383910a97ef581aaccc1aaec3bf144d0a623809093431b5234fe400c3a5411"} Oct 03 07:53:44 crc kubenswrapper[4664]: I1003 07:53:44.012995 4664 generic.go:334] "Generic (PLEG): container finished" podID="5844f220-7a74-41b4-9b03-eee894a66f32" containerID="27dccc9053b5087b570cf81a9be202ba88453afca8d5b6ca28307c723a784642" exitCode=0 Oct 03 07:53:44 crc kubenswrapper[4664]: I1003 07:53:44.013135 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27n9z" event={"ID":"5844f220-7a74-41b4-9b03-eee894a66f32","Type":"ContainerDied","Data":"27dccc9053b5087b570cf81a9be202ba88453afca8d5b6ca28307c723a784642"} Oct 03 07:53:44 crc kubenswrapper[4664]: I1003 07:53:44.052579 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pvwj7" podStartSLOduration=3.316611538 podStartE2EDuration="7.052559785s" podCreationTimestamp="2025-10-03 07:53:37 +0000 UTC" firstStartedPulling="2025-10-03 07:53:38.965136259 +0000 UTC m=+319.786326749" lastFinishedPulling="2025-10-03 07:53:42.701084506 +0000 UTC m=+323.522274996" observedRunningTime="2025-10-03 07:53:44.051001396 +0000 UTC m=+324.872191916" watchObservedRunningTime="2025-10-03 07:53:44.052559785 +0000 UTC m=+324.873750275" Oct 03 07:53:45 crc kubenswrapper[4664]: I1003 07:53:45.020527 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxhhl" event={"ID":"61835213-5c1e-47e2-88ed-453c167e750d","Type":"ContainerStarted","Data":"23503f0e66a006546d2c6a347b02f7aee5367e885e0fbfaa06ded0e4affc990a"} Oct 03 07:53:45 crc kubenswrapper[4664]: I1003 07:53:45.024089 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27n9z" event={"ID":"5844f220-7a74-41b4-9b03-eee894a66f32","Type":"ContainerStarted","Data":"fc17769ae83478283869f2d1268b1d36d49360d0473b425df054b382c15261f8"} Oct 03 07:53:45 crc kubenswrapper[4664]: I1003 07:53:45.026331 4664 
generic.go:334] "Generic (PLEG): container finished" podID="40c95b7d-be3e-4613-b795-f5d636b12ce4" containerID="af9e636443a538c3543f4db0aa467d351570d19f417d119f7a40068a46c0507c" exitCode=0 Oct 03 07:53:45 crc kubenswrapper[4664]: I1003 07:53:45.026378 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr8nh" event={"ID":"40c95b7d-be3e-4613-b795-f5d636b12ce4","Type":"ContainerDied","Data":"af9e636443a538c3543f4db0aa467d351570d19f417d119f7a40068a46c0507c"} Oct 03 07:53:45 crc kubenswrapper[4664]: I1003 07:53:45.051419 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dxhhl" podStartSLOduration=3.376061523 podStartE2EDuration="8.051399685s" podCreationTimestamp="2025-10-03 07:53:37 +0000 UTC" firstStartedPulling="2025-10-03 07:53:38.964145168 +0000 UTC m=+319.785335658" lastFinishedPulling="2025-10-03 07:53:43.63948332 +0000 UTC m=+324.460673820" observedRunningTime="2025-10-03 07:53:45.049492256 +0000 UTC m=+325.870682746" watchObservedRunningTime="2025-10-03 07:53:45.051399685 +0000 UTC m=+325.872590175" Oct 03 07:53:45 crc kubenswrapper[4664]: I1003 07:53:45.073073 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-27n9z" podStartSLOduration=1.578472134 podStartE2EDuration="5.073045107s" podCreationTimestamp="2025-10-03 07:53:40 +0000 UTC" firstStartedPulling="2025-10-03 07:53:40.979464547 +0000 UTC m=+321.800655037" lastFinishedPulling="2025-10-03 07:53:44.47403752 +0000 UTC m=+325.295228010" observedRunningTime="2025-10-03 07:53:45.070993244 +0000 UTC m=+325.892183734" watchObservedRunningTime="2025-10-03 07:53:45.073045107 +0000 UTC m=+325.894235607" Oct 03 07:53:46 crc kubenswrapper[4664]: I1003 07:53:46.041812 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr8nh" event={"ID":"40c95b7d-be3e-4613-b795-f5d636b12ce4","Type":"ContainerStarted","Data":"3330c5000fbbba8a868e43f570760da6b5dcae9570523be540f0e41bd6b6f8d1"} Oct 03 07:53:46 crc kubenswrapper[4664]: I1003 07:53:46.068877 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zr8nh" podStartSLOduration=2.669851146 podStartE2EDuration="6.068856923s" podCreationTimestamp="2025-10-03 07:53:40 +0000 UTC" firstStartedPulling="2025-10-03 07:53:42.105828635 +0000 UTC m=+322.927019125" lastFinishedPulling="2025-10-03 07:53:45.504834412 +0000 UTC m=+326.326024902" observedRunningTime="2025-10-03 07:53:46.063837357 +0000 UTC m=+326.885027867" watchObservedRunningTime="2025-10-03 07:53:46.068856923 +0000 UTC m=+326.890047413" Oct 03 07:53:47 crc kubenswrapper[4664]: I1003 07:53:47.988669 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pvwj7" Oct 03 07:53:47 crc kubenswrapper[4664]: I1003 07:53:47.989297 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pvwj7" Oct 03 07:53:48 crc kubenswrapper[4664]: I1003 07:53:48.046982 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pvwj7" Oct 03 07:53:48 crc kubenswrapper[4664]: I1003 07:53:48.093624 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pvwj7" Oct 03 07:53:48 crc kubenswrapper[4664]: I1003 07:53:48.200502 4664 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dxhhl" Oct 03 07:53:48 crc kubenswrapper[4664]: I1003 07:53:48.200591 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dxhhl" Oct 03 07:53:49 crc kubenswrapper[4664]: I1003 07:53:49.242540 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dxhhl" podUID="61835213-5c1e-47e2-88ed-453c167e750d" containerName="registry-server" probeResult="failure" output=< Oct 03 07:53:49 crc kubenswrapper[4664]: timeout: failed to connect service ":50051" within 1s Oct 03 07:53:49 crc kubenswrapper[4664]: > Oct 03 07:53:50 crc kubenswrapper[4664]: I1003 07:53:50.404090 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-27n9z" Oct 03 07:53:50 crc kubenswrapper[4664]: I1003 07:53:50.404475 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-27n9z" Oct 03 07:53:50 crc kubenswrapper[4664]: I1003 07:53:50.452679 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-27n9z" Oct 03 07:53:50 crc kubenswrapper[4664]: I1003 07:53:50.606180 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zr8nh" Oct 03 07:53:50 crc kubenswrapper[4664]: I1003 07:53:50.606577 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zr8nh" Oct 03 07:53:50 crc kubenswrapper[4664]: I1003 07:53:50.643162 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zr8nh" Oct 03 07:53:51 crc kubenswrapper[4664]: I1003 07:53:51.119999 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-27n9z" Oct 03 07:53:51 crc kubenswrapper[4664]: I1003 07:53:51.120494 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zr8nh" Oct 03 07:53:58 crc kubenswrapper[4664]: I1003 07:53:58.236224 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dxhhl" Oct 03 07:53:58 crc kubenswrapper[4664]: I1003 07:53:58.277884 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dxhhl" Oct 03 07:54:41 crc kubenswrapper[4664]: I1003 07:54:41.987568 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:54:41 crc kubenswrapper[4664]: I1003 07:54:41.988113 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:55:11 crc kubenswrapper[4664]: I1003 07:55:11.987746 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:55:11 crc kubenswrapper[4664]: I1003 07:55:11.988405 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:55:41 crc kubenswrapper[4664]: I1003 07:55:41.987664 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:55:41 crc kubenswrapper[4664]: I1003 07:55:41.988448 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:55:41 crc kubenswrapper[4664]: I1003 07:55:41.988519 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:55:41 crc kubenswrapper[4664]: I1003 07:55:41.989243 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"67bcc74759e35616c9e66561e791cb4182af2c1e5e890cdf15bef9f1f05e339f"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:55:41 crc kubenswrapper[4664]: I1003 07:55:41.989316 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://67bcc74759e35616c9e66561e791cb4182af2c1e5e890cdf15bef9f1f05e339f" gracePeriod=600 Oct 03 07:55:42 crc kubenswrapper[4664]: I1003 07:55:42.680965 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"67bcc74759e35616c9e66561e791cb4182af2c1e5e890cdf15bef9f1f05e339f"} Oct 03 07:55:42 crc kubenswrapper[4664]: I1003 07:55:42.680874 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="67bcc74759e35616c9e66561e791cb4182af2c1e5e890cdf15bef9f1f05e339f" exitCode=0 Oct 03 07:55:42 crc kubenswrapper[4664]: I1003 07:55:42.681347 4664 scope.go:117] "RemoveContainer" containerID="b446e000609801e974810aa233005f7c2810cc977c7a13155452de76e1268427" Oct 03 07:55:42 crc kubenswrapper[4664]: I1003 07:55:42.681395 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"2b9821c1193b1d9cb01e00b77753645dc25b62e22b09d41a84e0eb1787f597c7"} Oct 03 07:58:11 crc kubenswrapper[4664]: I1003 07:58:11.987110 4664 
patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:58:11 crc kubenswrapper[4664]: I1003 07:58:11.987813 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:58:41 crc kubenswrapper[4664]: I1003 07:58:41.987362 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:58:41 crc kubenswrapper[4664]: I1003 07:58:41.988595 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:59:11 crc kubenswrapper[4664]: I1003 07:59:11.987402 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:59:11 crc kubenswrapper[4664]: I1003 07:59:11.988029 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:59:11 crc kubenswrapper[4664]: I1003 07:59:11.988083 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 07:59:11 crc kubenswrapper[4664]: I1003 07:59:11.988743 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2b9821c1193b1d9cb01e00b77753645dc25b62e22b09d41a84e0eb1787f597c7"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:59:11 crc kubenswrapper[4664]: I1003 07:59:11.988807 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://2b9821c1193b1d9cb01e00b77753645dc25b62e22b09d41a84e0eb1787f597c7" gracePeriod=600 Oct 03 07:59:12 crc kubenswrapper[4664]: I1003 07:59:12.763206 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="2b9821c1193b1d9cb01e00b77753645dc25b62e22b09d41a84e0eb1787f597c7" exitCode=0 Oct 03 07:59:12 crc kubenswrapper[4664]: I1003 07:59:12.763673 4664 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"2b9821c1193b1d9cb01e00b77753645dc25b62e22b09d41a84e0eb1787f597c7"} Oct 03 07:59:12 crc kubenswrapper[4664]: I1003 07:59:12.763699 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"33a410bbdb246cf2e9dcb8e9de77a40e30f71ec5cde831e8cfca46d88165b8b1"} Oct 03 07:59:12 crc kubenswrapper[4664]: I1003 07:59:12.763715 4664 scope.go:117] "RemoveContainer" containerID="67bcc74759e35616c9e66561e791cb4182af2c1e5e890cdf15bef9f1f05e339f" Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.136827 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc"] Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.138669 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.141875 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.142945 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.154183 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc"] Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.284363 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsvn5\" (UniqueName: \"kubernetes.io/projected/5dc17bd7-8f56-4801-8595-5c4578747397-kube-api-access-xsvn5\") pod \"collect-profiles-29324640-tg8nc\" (UID: \"5dc17bd7-8f56-4801-8595-5c4578747397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.284432 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5dc17bd7-8f56-4801-8595-5c4578747397-config-volume\") pod \"collect-profiles-29324640-tg8nc\" (UID: \"5dc17bd7-8f56-4801-8595-5c4578747397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.284661 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5dc17bd7-8f56-4801-8595-5c4578747397-secret-volume\") pod \"collect-profiles-29324640-tg8nc\" (UID: \"5dc17bd7-8f56-4801-8595-5c4578747397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.386640 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5dc17bd7-8f56-4801-8595-5c4578747397-secret-volume\") pod \"collect-profiles-29324640-tg8nc\" (UID: \"5dc17bd7-8f56-4801-8595-5c4578747397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" Oct 03 08:00:00 crc 
kubenswrapper[4664]: I1003 08:00:00.386753 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsvn5\" (UniqueName: \"kubernetes.io/projected/5dc17bd7-8f56-4801-8595-5c4578747397-kube-api-access-xsvn5\") pod \"collect-profiles-29324640-tg8nc\" (UID: \"5dc17bd7-8f56-4801-8595-5c4578747397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.386797 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5dc17bd7-8f56-4801-8595-5c4578747397-config-volume\") pod \"collect-profiles-29324640-tg8nc\" (UID: \"5dc17bd7-8f56-4801-8595-5c4578747397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.388364 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5dc17bd7-8f56-4801-8595-5c4578747397-config-volume\") pod \"collect-profiles-29324640-tg8nc\" (UID: \"5dc17bd7-8f56-4801-8595-5c4578747397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.394709 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5dc17bd7-8f56-4801-8595-5c4578747397-secret-volume\") pod \"collect-profiles-29324640-tg8nc\" (UID: \"5dc17bd7-8f56-4801-8595-5c4578747397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.403917 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsvn5\" (UniqueName: \"kubernetes.io/projected/5dc17bd7-8f56-4801-8595-5c4578747397-kube-api-access-xsvn5\") pod \"collect-profiles-29324640-tg8nc\" (UID: \"5dc17bd7-8f56-4801-8595-5c4578747397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.460781 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.645315 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc"] Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.998221 4664 generic.go:334] "Generic (PLEG): container finished" podID="5dc17bd7-8f56-4801-8595-5c4578747397" containerID="8363663347e92baf645e5ae9dff3c8ecc7c6bacd929ea27644bf6242b0dfe4e1" exitCode=0 Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.998421 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" event={"ID":"5dc17bd7-8f56-4801-8595-5c4578747397","Type":"ContainerDied","Data":"8363663347e92baf645e5ae9dff3c8ecc7c6bacd929ea27644bf6242b0dfe4e1"} Oct 03 08:00:00 crc kubenswrapper[4664]: I1003 08:00:00.998621 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" event={"ID":"5dc17bd7-8f56-4801-8595-5c4578747397","Type":"ContainerStarted","Data":"35dabd509a5b648b3a711e2101fa580c3ca8b0d8d6b3762bb65475b07171ab47"} Oct 03 08:00:02 crc kubenswrapper[4664]: I1003 08:00:02.195990 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" Oct 03 08:00:02 crc kubenswrapper[4664]: I1003 08:00:02.310936 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5dc17bd7-8f56-4801-8595-5c4578747397-config-volume\") pod \"5dc17bd7-8f56-4801-8595-5c4578747397\" (UID: \"5dc17bd7-8f56-4801-8595-5c4578747397\") " Oct 03 08:00:02 crc kubenswrapper[4664]: I1003 08:00:02.311012 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5dc17bd7-8f56-4801-8595-5c4578747397-secret-volume\") pod \"5dc17bd7-8f56-4801-8595-5c4578747397\" (UID: \"5dc17bd7-8f56-4801-8595-5c4578747397\") " Oct 03 08:00:02 crc kubenswrapper[4664]: I1003 08:00:02.311052 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsvn5\" (UniqueName: \"kubernetes.io/projected/5dc17bd7-8f56-4801-8595-5c4578747397-kube-api-access-xsvn5\") pod \"5dc17bd7-8f56-4801-8595-5c4578747397\" (UID: \"5dc17bd7-8f56-4801-8595-5c4578747397\") " Oct 03 08:00:02 crc kubenswrapper[4664]: I1003 08:00:02.312594 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dc17bd7-8f56-4801-8595-5c4578747397-config-volume" (OuterVolumeSpecName: "config-volume") pod "5dc17bd7-8f56-4801-8595-5c4578747397" (UID: "5dc17bd7-8f56-4801-8595-5c4578747397"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:00:02 crc kubenswrapper[4664]: I1003 08:00:02.319215 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc17bd7-8f56-4801-8595-5c4578747397-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5dc17bd7-8f56-4801-8595-5c4578747397" (UID: "5dc17bd7-8f56-4801-8595-5c4578747397"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:00:02 crc kubenswrapper[4664]: I1003 08:00:02.320041 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dc17bd7-8f56-4801-8595-5c4578747397-kube-api-access-xsvn5" (OuterVolumeSpecName: "kube-api-access-xsvn5") pod "5dc17bd7-8f56-4801-8595-5c4578747397" (UID: "5dc17bd7-8f56-4801-8595-5c4578747397"). InnerVolumeSpecName "kube-api-access-xsvn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:00:02 crc kubenswrapper[4664]: I1003 08:00:02.412840 4664 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5dc17bd7-8f56-4801-8595-5c4578747397-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:00:02 crc kubenswrapper[4664]: I1003 08:00:02.412893 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsvn5\" (UniqueName: \"kubernetes.io/projected/5dc17bd7-8f56-4801-8595-5c4578747397-kube-api-access-xsvn5\") on node \"crc\" DevicePath \"\"" Oct 03 08:00:02 crc kubenswrapper[4664]: I1003 08:00:02.412906 4664 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5dc17bd7-8f56-4801-8595-5c4578747397-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:00:03 crc kubenswrapper[4664]: I1003 08:00:03.011434 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" event={"ID":"5dc17bd7-8f56-4801-8595-5c4578747397","Type":"ContainerDied","Data":"35dabd509a5b648b3a711e2101fa580c3ca8b0d8d6b3762bb65475b07171ab47"} Oct 03 08:00:03 crc kubenswrapper[4664]: I1003 08:00:03.011498 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35dabd509a5b648b3a711e2101fa580c3ca8b0d8d6b3762bb65475b07171ab47" Oct 03 08:00:03 crc kubenswrapper[4664]: I1003 08:00:03.011532 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc" Oct 03 08:00:46 crc kubenswrapper[4664]: I1003 08:00:46.865853 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sd2nj"] Oct 03 08:00:46 crc kubenswrapper[4664]: I1003 08:00:46.867530 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" podUID="1b63fb51-bfa4-4c92-a1b0-9044cf7cff03" containerName="controller-manager" containerID="cri-o://bfdc5c9858c31055e7fb71a810ef444fe98f36862c5c54c54bf46c24840a5952" gracePeriod=30 Oct 03 08:00:46 crc kubenswrapper[4664]: I1003 08:00:46.991289 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l"] Oct 03 08:00:46 crc kubenswrapper[4664]: I1003 08:00:46.991945 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" podUID="11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e" containerName="route-controller-manager" containerID="cri-o://1133a9ae7f3f82170f7e6137864a76e7e56515b17c4c0e57529489e2ae0a0aec" gracePeriod=30 Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.268503 4664 generic.go:334] "Generic (PLEG): container finished" podID="1b63fb51-bfa4-4c92-a1b0-9044cf7cff03" containerID="bfdc5c9858c31055e7fb71a810ef444fe98f36862c5c54c54bf46c24840a5952" exitCode=0 Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.268624 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" event={"ID":"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03","Type":"ContainerDied","Data":"bfdc5c9858c31055e7fb71a810ef444fe98f36862c5c54c54bf46c24840a5952"} Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.271808 4664 generic.go:334] "Generic (PLEG): container finished" podID="11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e" containerID="1133a9ae7f3f82170f7e6137864a76e7e56515b17c4c0e57529489e2ae0a0aec" exitCode=0 Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.271838 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" event={"ID":"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e","Type":"ContainerDied","Data":"1133a9ae7f3f82170f7e6137864a76e7e56515b17c4c0e57529489e2ae0a0aec"} Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.385212 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.427393 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.477622 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-config\") pod \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.477907 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-proxy-ca-bundles\") pod \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.477994 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-client-ca\") pod \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\" (UID: \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.478104 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-config\") pod \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\" (UID: \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.478179 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-client-ca\") pod \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.478289 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8tqq\" (UniqueName: \"kubernetes.io/projected/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-kube-api-access-n8tqq\") pod \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\" (UID: \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.478400 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqrm7\" (UniqueName: \"kubernetes.io/projected/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-kube-api-access-tqrm7\") pod \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.478463 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-serving-cert\") pod \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\" (UID: \"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e\") " Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.478565 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-serving-cert\") pod \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\" (UID: \"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03\") " Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.484073 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod 
"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03" (UID: "1b63fb51-bfa4-4c92-a1b0-9044cf7cff03"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.485123 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-client-ca" (OuterVolumeSpecName: "client-ca") pod "11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e" (UID: "11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.484225 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-client-ca" (OuterVolumeSpecName: "client-ca") pod "1b63fb51-bfa4-4c92-a1b0-9044cf7cff03" (UID: "1b63fb51-bfa4-4c92-a1b0-9044cf7cff03"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.484626 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-config" (OuterVolumeSpecName: "config") pod "11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e" (UID: "11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.484691 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1b63fb51-bfa4-4c92-a1b0-9044cf7cff03" (UID: "1b63fb51-bfa4-4c92-a1b0-9044cf7cff03"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.484833 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-config" (OuterVolumeSpecName: "config") pod "1b63fb51-bfa4-4c92-a1b0-9044cf7cff03" (UID: "1b63fb51-bfa4-4c92-a1b0-9044cf7cff03"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.491940 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e" (UID: "11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.491961 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-kube-api-access-tqrm7" (OuterVolumeSpecName: "kube-api-access-tqrm7") pod "1b63fb51-bfa4-4c92-a1b0-9044cf7cff03" (UID: "1b63fb51-bfa4-4c92-a1b0-9044cf7cff03"). InnerVolumeSpecName "kube-api-access-tqrm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.492972 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-kube-api-access-n8tqq" (OuterVolumeSpecName: "kube-api-access-n8tqq") pod "11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e" (UID: "11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e"). InnerVolumeSpecName "kube-api-access-n8tqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.579496 4664 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.579540 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8tqq\" (UniqueName: \"kubernetes.io/projected/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-kube-api-access-n8tqq\") on node \"crc\" DevicePath \"\"" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.579556 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqrm7\" (UniqueName: \"kubernetes.io/projected/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-kube-api-access-tqrm7\") on node \"crc\" DevicePath \"\"" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.579568 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.579578 4664 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.579587 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.579595 4664 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.579626 4664 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:00:47 crc kubenswrapper[4664]: I1003 08:00:47.579645 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.278085 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.278071 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l" event={"ID":"11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e","Type":"ContainerDied","Data":"05edfd23a24c34af5986d8f26457b0d1ea4b17c287ece267a213f607d9bb5f28"} Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.278182 4664 scope.go:117] "RemoveContainer" containerID="1133a9ae7f3f82170f7e6137864a76e7e56515b17c4c0e57529489e2ae0a0aec" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.281022 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" event={"ID":"1b63fb51-bfa4-4c92-a1b0-9044cf7cff03","Type":"ContainerDied","Data":"a6e177761557151257501904255ec8e94c8e46f85fa67ec67cb557b0406ca568"} Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.281320 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sd2nj" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.293343 4664 scope.go:117] "RemoveContainer" containerID="bfdc5c9858c31055e7fb71a810ef444fe98f36862c5c54c54bf46c24840a5952" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.297347 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52"] Oct 03 08:00:48 crc kubenswrapper[4664]: E1003 08:00:48.297550 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b63fb51-bfa4-4c92-a1b0-9044cf7cff03" containerName="controller-manager" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.297565 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b63fb51-bfa4-4c92-a1b0-9044cf7cff03" containerName="controller-manager" Oct 03 08:00:48 crc kubenswrapper[4664]: E1003 08:00:48.297576 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc17bd7-8f56-4801-8595-5c4578747397" containerName="collect-profiles" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.297582 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc17bd7-8f56-4801-8595-5c4578747397" containerName="collect-profiles" Oct 03 08:00:48 crc kubenswrapper[4664]: E1003 08:00:48.297598 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e" containerName="route-controller-manager" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.297638 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e" containerName="route-controller-manager" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.297727 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e" containerName="route-controller-manager" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.297744 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc17bd7-8f56-4801-8595-5c4578747397" containerName="collect-profiles" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.297753 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b63fb51-bfa4-4c92-a1b0-9044cf7cff03" containerName="controller-manager" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.298124 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.304758 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.304996 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.305118 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.305405 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.305447 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.305479 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.309394 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52"] Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.322064 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sd2nj"] Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.327512 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sd2nj"] Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.340407 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l"] Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.345057 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sm42l"] Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.395505 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4d9fd18-2046-473a-a48c-bbfb769cf382-client-ca\") pod \"route-controller-manager-fb878f79-5mk52\" (UID: \"c4d9fd18-2046-473a-a48c-bbfb769cf382\") " pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.395563 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4d9fd18-2046-473a-a48c-bbfb769cf382-serving-cert\") pod \"route-controller-manager-fb878f79-5mk52\" (UID: \"c4d9fd18-2046-473a-a48c-bbfb769cf382\") " pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.395583 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzq2j\" (UniqueName: \"kubernetes.io/projected/c4d9fd18-2046-473a-a48c-bbfb769cf382-kube-api-access-xzq2j\") pod \"route-controller-manager-fb878f79-5mk52\" (UID: \"c4d9fd18-2046-473a-a48c-bbfb769cf382\") " 
pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.395642 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d9fd18-2046-473a-a48c-bbfb769cf382-config\") pod \"route-controller-manager-fb878f79-5mk52\" (UID: \"c4d9fd18-2046-473a-a48c-bbfb769cf382\") " pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.496969 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4d9fd18-2046-473a-a48c-bbfb769cf382-client-ca\") pod \"route-controller-manager-fb878f79-5mk52\" (UID: \"c4d9fd18-2046-473a-a48c-bbfb769cf382\") " pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.497064 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4d9fd18-2046-473a-a48c-bbfb769cf382-serving-cert\") pod \"route-controller-manager-fb878f79-5mk52\" (UID: \"c4d9fd18-2046-473a-a48c-bbfb769cf382\") " pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.497101 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzq2j\" (UniqueName: \"kubernetes.io/projected/c4d9fd18-2046-473a-a48c-bbfb769cf382-kube-api-access-xzq2j\") pod \"route-controller-manager-fb878f79-5mk52\" (UID: \"c4d9fd18-2046-473a-a48c-bbfb769cf382\") " pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.497140 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d9fd18-2046-473a-a48c-bbfb769cf382-config\") pod \"route-controller-manager-fb878f79-5mk52\" (UID: \"c4d9fd18-2046-473a-a48c-bbfb769cf382\") " pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.498364 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4d9fd18-2046-473a-a48c-bbfb769cf382-client-ca\") pod \"route-controller-manager-fb878f79-5mk52\" (UID: \"c4d9fd18-2046-473a-a48c-bbfb769cf382\") " pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.498573 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d9fd18-2046-473a-a48c-bbfb769cf382-config\") pod \"route-controller-manager-fb878f79-5mk52\" (UID: \"c4d9fd18-2046-473a-a48c-bbfb769cf382\") " pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.508383 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4d9fd18-2046-473a-a48c-bbfb769cf382-serving-cert\") pod \"route-controller-manager-fb878f79-5mk52\" (UID: \"c4d9fd18-2046-473a-a48c-bbfb769cf382\") " pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 
Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.615879 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52"
Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.634972 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5456dd5d7-4hv7z"]
Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.635717 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z"
Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.637230 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.645233 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.645395 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.645535 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.645962 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.646181 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.649588 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.649747 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5456dd5d7-4hv7z"]
Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.704065 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj5ks\" (UniqueName: \"kubernetes.io/projected/8767f61f-a497-40ad-b761-b72f97710556-kube-api-access-zj5ks\") pod \"controller-manager-5456dd5d7-4hv7z\" (UID: \"8767f61f-a497-40ad-b761-b72f97710556\") " pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z"
Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.704117 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8767f61f-a497-40ad-b761-b72f97710556-client-ca\") pod \"controller-manager-5456dd5d7-4hv7z\" (UID: \"8767f61f-a497-40ad-b761-b72f97710556\") " pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z"
Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.704147 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\"
(UniqueName: \"kubernetes.io/configmap/8767f61f-a497-40ad-b761-b72f97710556-config\") pod \"controller-manager-5456dd5d7-4hv7z\" (UID: \"8767f61f-a497-40ad-b761-b72f97710556\") " pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.704181 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8767f61f-a497-40ad-b761-b72f97710556-proxy-ca-bundles\") pod \"controller-manager-5456dd5d7-4hv7z\" (UID: \"8767f61f-a497-40ad-b761-b72f97710556\") " pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.704217 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8767f61f-a497-40ad-b761-b72f97710556-serving-cert\") pod \"controller-manager-5456dd5d7-4hv7z\" (UID: \"8767f61f-a497-40ad-b761-b72f97710556\") " pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.806590 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8767f61f-a497-40ad-b761-b72f97710556-proxy-ca-bundles\") pod \"controller-manager-5456dd5d7-4hv7z\" (UID: \"8767f61f-a497-40ad-b761-b72f97710556\") " pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.806773 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8767f61f-a497-40ad-b761-b72f97710556-serving-cert\") pod \"controller-manager-5456dd5d7-4hv7z\" (UID: \"8767f61f-a497-40ad-b761-b72f97710556\") " pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.807145 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8767f61f-a497-40ad-b761-b72f97710556-client-ca\") pod \"controller-manager-5456dd5d7-4hv7z\" (UID: \"8767f61f-a497-40ad-b761-b72f97710556\") " pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.807190 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj5ks\" (UniqueName: \"kubernetes.io/projected/8767f61f-a497-40ad-b761-b72f97710556-kube-api-access-zj5ks\") pod \"controller-manager-5456dd5d7-4hv7z\" (UID: \"8767f61f-a497-40ad-b761-b72f97710556\") " pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.807229 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8767f61f-a497-40ad-b761-b72f97710556-config\") pod \"controller-manager-5456dd5d7-4hv7z\" (UID: \"8767f61f-a497-40ad-b761-b72f97710556\") " pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.808308 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8767f61f-a497-40ad-b761-b72f97710556-proxy-ca-bundles\") pod \"controller-manager-5456dd5d7-4hv7z\" (UID: \"8767f61f-a497-40ad-b761-b72f97710556\") " 
pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.808413 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8767f61f-a497-40ad-b761-b72f97710556-client-ca\") pod \"controller-manager-5456dd5d7-4hv7z\" (UID: \"8767f61f-a497-40ad-b761-b72f97710556\") " pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.811387 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8767f61f-a497-40ad-b761-b72f97710556-serving-cert\") pod \"controller-manager-5456dd5d7-4hv7z\" (UID: \"8767f61f-a497-40ad-b761-b72f97710556\") " pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.816805 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8767f61f-a497-40ad-b761-b72f97710556-config\") pod \"controller-manager-5456dd5d7-4hv7z\" (UID: \"8767f61f-a497-40ad-b761-b72f97710556\") " pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.827073 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj5ks\" (UniqueName: \"kubernetes.io/projected/8767f61f-a497-40ad-b761-b72f97710556-kube-api-access-zj5ks\") pod \"controller-manager-5456dd5d7-4hv7z\" (UID: \"8767f61f-a497-40ad-b761-b72f97710556\") " pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.832448 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52"] Oct 03 08:00:48 crc kubenswrapper[4664]: I1003 08:00:48.988915 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:49 crc kubenswrapper[4664]: I1003 08:00:49.179587 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5456dd5d7-4hv7z"] Oct 03 08:00:49 crc kubenswrapper[4664]: I1003 08:00:49.285646 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" event={"ID":"8767f61f-a497-40ad-b761-b72f97710556","Type":"ContainerStarted","Data":"c86ec148ff71328681d5a90be11678950f702d081888880636b89af1b752af49"} Oct 03 08:00:49 crc kubenswrapper[4664]: I1003 08:00:49.288434 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" event={"ID":"c4d9fd18-2046-473a-a48c-bbfb769cf382","Type":"ContainerStarted","Data":"dbf95f9755005dce38fe09c96bb41f720ecda5c869a4af6c1494316ab16e7f5a"} Oct 03 08:00:49 crc kubenswrapper[4664]: I1003 08:00:49.288475 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" event={"ID":"c4d9fd18-2046-473a-a48c-bbfb769cf382","Type":"ContainerStarted","Data":"610b45e89295b2744259ecc14f1c4904632c4cccd9216218843e51f2b69a2384"} Oct 03 08:00:49 crc kubenswrapper[4664]: I1003 08:00:49.288700 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" Oct 03 08:00:49 crc kubenswrapper[4664]: I1003 08:00:49.308020 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" podStartSLOduration=1.308000007 podStartE2EDuration="1.308000007s" podCreationTimestamp="2025-10-03 08:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:00:49.306936547 +0000 UTC m=+750.128127057" watchObservedRunningTime="2025-10-03 08:00:49.308000007 +0000 UTC m=+750.129190497" Oct 03 08:00:49 crc kubenswrapper[4664]: I1003 08:00:49.402471 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-fb878f79-5mk52" Oct 03 08:00:49 crc kubenswrapper[4664]: I1003 08:00:49.883445 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e" path="/var/lib/kubelet/pods/11fab2ae-3bcb-42ac-bde8-9ceeb3e2da3e/volumes" Oct 03 08:00:49 crc kubenswrapper[4664]: I1003 08:00:49.883984 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b63fb51-bfa4-4c92-a1b0-9044cf7cff03" path="/var/lib/kubelet/pods/1b63fb51-bfa4-4c92-a1b0-9044cf7cff03/volumes" Oct 03 08:00:50 crc kubenswrapper[4664]: I1003 08:00:50.304644 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" event={"ID":"8767f61f-a497-40ad-b761-b72f97710556","Type":"ContainerStarted","Data":"f47fa98cbd1ec5622704ad1062d3604c9abfde2f16cbdc088726e7866566a984"} Oct 03 08:00:50 crc kubenswrapper[4664]: I1003 08:00:50.320890 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" podStartSLOduration=3.320868738 podStartE2EDuration="3.320868738s" podCreationTimestamp="2025-10-03 08:00:47 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:00:50.319649584 +0000 UTC m=+751.140840084" watchObservedRunningTime="2025-10-03 08:00:50.320868738 +0000 UTC m=+751.142059228" Oct 03 08:00:51 crc kubenswrapper[4664]: I1003 08:00:51.308960 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:51 crc kubenswrapper[4664]: I1003 08:00:51.314194 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5456dd5d7-4hv7z" Oct 03 08:00:54 crc kubenswrapper[4664]: I1003 08:00:54.513835 4664 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.455877 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xdgnx"] Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.457672 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xdgnx" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.481248 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xdgnx"] Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.522681 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-utilities\") pod \"community-operators-xdgnx\" (UID: \"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468\") " pod="openshift-marketplace/community-operators-xdgnx" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.522756 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg8xx\" (UniqueName: \"kubernetes.io/projected/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-kube-api-access-zg8xx\") pod \"community-operators-xdgnx\" (UID: \"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468\") " pod="openshift-marketplace/community-operators-xdgnx" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.522786 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-catalog-content\") pod \"community-operators-xdgnx\" (UID: \"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468\") " pod="openshift-marketplace/community-operators-xdgnx" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.623737 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg8xx\" (UniqueName: \"kubernetes.io/projected/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-kube-api-access-zg8xx\") pod \"community-operators-xdgnx\" (UID: \"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468\") " pod="openshift-marketplace/community-operators-xdgnx" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.623793 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-catalog-content\") pod \"community-operators-xdgnx\" (UID: \"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468\") " pod="openshift-marketplace/community-operators-xdgnx" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.623884 4664 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-utilities\") pod \"community-operators-xdgnx\" (UID: \"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468\") " pod="openshift-marketplace/community-operators-xdgnx" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.624418 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-utilities\") pod \"community-operators-xdgnx\" (UID: \"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468\") " pod="openshift-marketplace/community-operators-xdgnx" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.624757 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-catalog-content\") pod \"community-operators-xdgnx\" (UID: \"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468\") " pod="openshift-marketplace/community-operators-xdgnx" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.644552 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg8xx\" (UniqueName: \"kubernetes.io/projected/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-kube-api-access-zg8xx\") pod \"community-operators-xdgnx\" (UID: \"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468\") " pod="openshift-marketplace/community-operators-xdgnx" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.775347 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xdgnx" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.952744 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-zvfms"] Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.956146 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-2f8b7"] Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.957130 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-2f8b7" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.957573 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-zvfms" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.961797 4664 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-6pm4j" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.962786 4664 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-9sxzw" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.962985 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.967857 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 03 08:01:19 crc kubenswrapper[4664]: I1003 08:01:19.982995 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-zvfms"] Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.000156 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-2f8b7"] Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.027659 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-xtrlc"] Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.028404 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-xtrlc" Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.032277 4664 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-stq8q" Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.034139 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wcxt\" (UniqueName: \"kubernetes.io/projected/d3d13871-7a04-4b9f-a8f0-7fbaff9cff1e-kube-api-access-6wcxt\") pod \"cert-manager-5b446d88c5-2f8b7\" (UID: \"d3d13871-7a04-4b9f-a8f0-7fbaff9cff1e\") " pod="cert-manager/cert-manager-5b446d88c5-2f8b7" Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.034180 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzjcl\" (UniqueName: \"kubernetes.io/projected/bc158e46-d2d5-4c58-aa38-a1d395d68991-kube-api-access-hzjcl\") pod \"cert-manager-cainjector-7f985d654d-zvfms\" (UID: \"bc158e46-d2d5-4c58-aa38-a1d395d68991\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-zvfms" Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.044851 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-xtrlc"] Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.138955 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wcxt\" (UniqueName: \"kubernetes.io/projected/d3d13871-7a04-4b9f-a8f0-7fbaff9cff1e-kube-api-access-6wcxt\") pod \"cert-manager-5b446d88c5-2f8b7\" (UID: \"d3d13871-7a04-4b9f-a8f0-7fbaff9cff1e\") " pod="cert-manager/cert-manager-5b446d88c5-2f8b7" Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.139003 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzjcl\" (UniqueName: \"kubernetes.io/projected/bc158e46-d2d5-4c58-aa38-a1d395d68991-kube-api-access-hzjcl\") pod \"cert-manager-cainjector-7f985d654d-zvfms\" (UID: \"bc158e46-d2d5-4c58-aa38-a1d395d68991\") " 
pod="cert-manager/cert-manager-cainjector-7f985d654d-zvfms" Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.139058 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph69h\" (UniqueName: \"kubernetes.io/projected/bbaac95d-7035-4d08-97e6-70e1b4ef4b3f-kube-api-access-ph69h\") pod \"cert-manager-webhook-5655c58dd6-xtrlc\" (UID: \"bbaac95d-7035-4d08-97e6-70e1b4ef4b3f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-xtrlc" Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.157232 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzjcl\" (UniqueName: \"kubernetes.io/projected/bc158e46-d2d5-4c58-aa38-a1d395d68991-kube-api-access-hzjcl\") pod \"cert-manager-cainjector-7f985d654d-zvfms\" (UID: \"bc158e46-d2d5-4c58-aa38-a1d395d68991\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-zvfms" Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.161227 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wcxt\" (UniqueName: \"kubernetes.io/projected/d3d13871-7a04-4b9f-a8f0-7fbaff9cff1e-kube-api-access-6wcxt\") pod \"cert-manager-5b446d88c5-2f8b7\" (UID: \"d3d13871-7a04-4b9f-a8f0-7fbaff9cff1e\") " pod="cert-manager/cert-manager-5b446d88c5-2f8b7" Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.239915 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph69h\" (UniqueName: \"kubernetes.io/projected/bbaac95d-7035-4d08-97e6-70e1b4ef4b3f-kube-api-access-ph69h\") pod \"cert-manager-webhook-5655c58dd6-xtrlc\" (UID: \"bbaac95d-7035-4d08-97e6-70e1b4ef4b3f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-xtrlc" Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.255861 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph69h\" (UniqueName: \"kubernetes.io/projected/bbaac95d-7035-4d08-97e6-70e1b4ef4b3f-kube-api-access-ph69h\") pod \"cert-manager-webhook-5655c58dd6-xtrlc\" (UID: \"bbaac95d-7035-4d08-97e6-70e1b4ef4b3f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-xtrlc" Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.315182 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-2f8b7" Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.342125 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-zvfms" Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.350423 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xdgnx"] Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.366738 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-xtrlc" Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.460069 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdgnx" event={"ID":"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468","Type":"ContainerStarted","Data":"41516f657ed1357d8bfb091a6504ca742104a39b02f18a9c93dd707233083c45"} Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.734421 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-2f8b7"] Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.754404 4664 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.826375 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-zvfms"] Oct 03 08:01:20 crc kubenswrapper[4664]: W1003 08:01:20.830963 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc158e46_d2d5_4c58_aa38_a1d395d68991.slice/crio-3f3f355caa4dc0d9f411ac6dd31af3347176dbb1c7602d5983878c54a5261e28 WatchSource:0}: Error finding container 3f3f355caa4dc0d9f411ac6dd31af3347176dbb1c7602d5983878c54a5261e28: Status 404 returned error can't find the container with id 3f3f355caa4dc0d9f411ac6dd31af3347176dbb1c7602d5983878c54a5261e28 Oct 03 08:01:20 crc kubenswrapper[4664]: I1003 08:01:20.899333 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-xtrlc"] Oct 03 08:01:20 crc kubenswrapper[4664]: W1003 08:01:20.905287 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbaac95d_7035_4d08_97e6_70e1b4ef4b3f.slice/crio-16b148634ca30190927a494493c1c78e615e4bea1830e4fd88d0e7c6081b224e WatchSource:0}: Error finding container 16b148634ca30190927a494493c1c78e615e4bea1830e4fd88d0e7c6081b224e: Status 404 returned error can't find the container with id 16b148634ca30190927a494493c1c78e615e4bea1830e4fd88d0e7c6081b224e Oct 03 08:01:21 crc kubenswrapper[4664]: I1003 08:01:21.466332 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-zvfms" event={"ID":"bc158e46-d2d5-4c58-aa38-a1d395d68991","Type":"ContainerStarted","Data":"3f3f355caa4dc0d9f411ac6dd31af3347176dbb1c7602d5983878c54a5261e28"} Oct 03 08:01:21 crc kubenswrapper[4664]: I1003 08:01:21.467563 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-2f8b7" event={"ID":"d3d13871-7a04-4b9f-a8f0-7fbaff9cff1e","Type":"ContainerStarted","Data":"3b5a02177b03e3655b19b307727747d1d48aa87816971aa75e4d83e112eb4e88"} Oct 03 08:01:21 crc kubenswrapper[4664]: I1003 08:01:21.468814 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-xtrlc" event={"ID":"bbaac95d-7035-4d08-97e6-70e1b4ef4b3f","Type":"ContainerStarted","Data":"16b148634ca30190927a494493c1c78e615e4bea1830e4fd88d0e7c6081b224e"} Oct 03 08:01:21 crc kubenswrapper[4664]: I1003 08:01:21.470088 4664 generic.go:334] "Generic (PLEG): container finished" podID="a23c5b33-2c0a-4fd8-ab02-2bdb1130a468" containerID="202bd688a282a89bee6cd992ac37c678c045f64d50e02190ca10a962b6830515" exitCode=0 Oct 03 08:01:21 crc kubenswrapper[4664]: I1003 08:01:21.470119 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-xdgnx" event={"ID":"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468","Type":"ContainerDied","Data":"202bd688a282a89bee6cd992ac37c678c045f64d50e02190ca10a962b6830515"} Oct 03 08:01:25 crc kubenswrapper[4664]: I1003 08:01:25.495728 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-2f8b7" event={"ID":"d3d13871-7a04-4b9f-a8f0-7fbaff9cff1e","Type":"ContainerStarted","Data":"95baca410a530e952ccb19d1970a2295c4206b7279b9058682a1d7ef86f7c0f2"} Oct 03 08:01:25 crc kubenswrapper[4664]: I1003 08:01:25.497409 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-xtrlc" event={"ID":"bbaac95d-7035-4d08-97e6-70e1b4ef4b3f","Type":"ContainerStarted","Data":"742a32dd4c14076f84b64a99b08d1ae0709ede26b5419f4d3edca5a6d1006b67"} Oct 03 08:01:25 crc kubenswrapper[4664]: I1003 08:01:25.499537 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdgnx" event={"ID":"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468","Type":"ContainerStarted","Data":"970dd5b273c4279232457ebc6aab29153aa669d2107b734cc0b49cd51d954851"} Oct 03 08:01:25 crc kubenswrapper[4664]: I1003 08:01:25.501118 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-zvfms" event={"ID":"bc158e46-d2d5-4c58-aa38-a1d395d68991","Type":"ContainerStarted","Data":"ade34bd06772db755d4c2c8d7ecb7654cc5083e55a4a29cfc77dca658746f93b"} Oct 03 08:01:26 crc kubenswrapper[4664]: I1003 08:01:26.507896 4664 generic.go:334] "Generic (PLEG): container finished" podID="a23c5b33-2c0a-4fd8-ab02-2bdb1130a468" containerID="970dd5b273c4279232457ebc6aab29153aa669d2107b734cc0b49cd51d954851" exitCode=0 Oct 03 08:01:26 crc kubenswrapper[4664]: I1003 08:01:26.507956 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdgnx" event={"ID":"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468","Type":"ContainerDied","Data":"970dd5b273c4279232457ebc6aab29153aa669d2107b734cc0b49cd51d954851"} Oct 03 08:01:26 crc kubenswrapper[4664]: I1003 08:01:26.508351 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-xtrlc" Oct 03 08:01:26 crc kubenswrapper[4664]: I1003 08:01:26.545702 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-xtrlc" podStartSLOduration=3.367859369 podStartE2EDuration="7.545681774s" podCreationTimestamp="2025-10-03 08:01:19 +0000 UTC" firstStartedPulling="2025-10-03 08:01:20.907268449 +0000 UTC m=+781.728458929" lastFinishedPulling="2025-10-03 08:01:25.085090844 +0000 UTC m=+785.906281334" observedRunningTime="2025-10-03 08:01:26.543200457 +0000 UTC m=+787.364390967" watchObservedRunningTime="2025-10-03 08:01:26.545681774 +0000 UTC m=+787.366872284" Oct 03 08:01:26 crc kubenswrapper[4664]: I1003 08:01:26.559018 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-zvfms" podStartSLOduration=3.306500361 podStartE2EDuration="7.559006018s" podCreationTimestamp="2025-10-03 08:01:19 +0000 UTC" firstStartedPulling="2025-10-03 08:01:20.833447381 +0000 UTC m=+781.654637871" lastFinishedPulling="2025-10-03 08:01:25.085953038 +0000 UTC m=+785.907143528" observedRunningTime="2025-10-03 08:01:26.558740221 +0000 UTC m=+787.379930711" watchObservedRunningTime="2025-10-03 08:01:26.559006018 +0000 UTC m=+787.380196508" Oct 03 
Oct 03 08:01:27 crc kubenswrapper[4664]: I1003 08:01:27.515548 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdgnx" event={"ID":"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468","Type":"ContainerStarted","Data":"f601128437d606a98290048770341bda91c4f2959d07bd40533983ce7a71c3ad"}
Oct 03 08:01:27 crc kubenswrapper[4664]: I1003 08:01:27.534035 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xdgnx" podStartSLOduration=2.978822805 podStartE2EDuration="8.534018406s" podCreationTimestamp="2025-10-03 08:01:19 +0000 UTC" firstStartedPulling="2025-10-03 08:01:21.472397805 +0000 UTC m=+782.293588295" lastFinishedPulling="2025-10-03 08:01:27.027593406 +0000 UTC m=+787.848783896" observedRunningTime="2025-10-03 08:01:27.532903516 +0000 UTC m=+788.354094016" watchObservedRunningTime="2025-10-03 08:01:27.534018406 +0000 UTC m=+788.355208886"
Oct 03 08:01:29 crc kubenswrapper[4664]: I1003 08:01:29.776310 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xdgnx"
Oct 03 08:01:29 crc kubenswrapper[4664]: I1003 08:01:29.776655 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xdgnx"
Oct 03 08:01:29 crc kubenswrapper[4664]: I1003 08:01:29.812250 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xdgnx"
Oct 03 08:01:30 crc kubenswrapper[4664]: I1003 08:01:30.370703 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-xtrlc"
Oct 03 08:01:30 crc kubenswrapper[4664]: I1003 08:01:30.422367 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2jpvm"]
Oct 03 08:01:30 crc kubenswrapper[4664]: I1003 08:01:30.423513 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovn-controller" containerID="cri-o://023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db" gracePeriod=30
Oct 03 08:01:30 crc kubenswrapper[4664]: I1003 08:01:30.423562 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="nbdb" containerID="cri-o://5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1" gracePeriod=30
Oct 03 08:01:30 crc kubenswrapper[4664]: I1003 08:01:30.423748 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="northd" containerID="cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add" gracePeriod=30
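Each "Killing container with a grace period" line (the rest of the pod's containers follow the same path just below) is the kubelet translating the pod's termination grace period, 30s here, which is the Kubernetes default, into a per-container stop timeout for CRI-O. The API-side trigger is the SyncLoop DELETE above; a rough sketch of the equivalent client-side call, with an explicit grace period for illustration:

    package gracedemo

    import (
        "context"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
    )

    // deleteWithGrace issues the pod DELETE that starts the kill sequence
    // seen in the log; omitting GracePeriodSeconds uses the pod's own
    // terminationGracePeriodSeconds (default 30).
    func deleteWithGrace(ctx context.Context, cs *kubernetes.Clientset) error {
        grace := int64(30)
        return cs.CoreV1().Pods("openshift-ovn-kubernetes").Delete(ctx,
            "ovnkube-node-2jpvm",
            metav1.DeleteOptions{GracePeriodSeconds: &grace},
        )
    }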
containerID="cri-o://2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add" gracePeriod=30 Oct 03 08:01:30 crc kubenswrapper[4664]: I1003 08:01:30.423806 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2" gracePeriod=30 Oct 03 08:01:30 crc kubenswrapper[4664]: I1003 08:01:30.423850 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="kube-rbac-proxy-node" containerID="cri-o://67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac" gracePeriod=30 Oct 03 08:01:30 crc kubenswrapper[4664]: I1003 08:01:30.423886 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovn-acl-logging" containerID="cri-o://82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00" gracePeriod=30 Oct 03 08:01:30 crc kubenswrapper[4664]: I1003 08:01:30.424085 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="sbdb" containerID="cri-o://a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b" gracePeriod=30 Oct 03 08:01:30 crc kubenswrapper[4664]: I1003 08:01:30.528140 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" containerID="cri-o://74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d" gracePeriod=30 Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.176006 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/3.log" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.178838 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovn-acl-logging/0.log" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.179419 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovn-controller/0.log" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.179995 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228387 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x2777"] Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.228628 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="kube-rbac-proxy-node" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228642 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="kube-rbac-proxy-node" Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.228650 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228657 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.228667 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228673 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.228680 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="sbdb" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228687 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="sbdb" Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.228696 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228701 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.228710 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="kubecfg-setup" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228716 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="kubecfg-setup" Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.228724 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228729 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.228740 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovn-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228746 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovn-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.228755 4664 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="northd" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228760 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="northd" Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.228773 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovn-acl-logging" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228778 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovn-acl-logging" Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.228786 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="nbdb" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228793 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="nbdb" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228883 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228895 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="nbdb" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228903 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228911 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovn-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228920 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="sbdb" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228927 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="northd" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228934 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228940 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228949 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovn-acl-logging" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228956 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.228963 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="kube-rbac-proxy-node" Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.229043 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.229050 4664 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.229057 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.229063 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.229160 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerName="ovnkube-controller" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.244332 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306181 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-var-lib-openvswitch\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306283 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-env-overrides\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306314 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-log-socket\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306345 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-run-ovn-kubernetes\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306367 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-cni-netd\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306385 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-slash\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306409 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-ovn\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306425 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-etc-openvswitch\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306463 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-run-netns\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306489 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-kubelet\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306516 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovnkube-config\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306505 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-log-socket" (OuterVolumeSpecName: "log-socket") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306523 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306585 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306590 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306548 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306525 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306654 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306590 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306569 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306600 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-slash" (OuterVolumeSpecName: "host-slash") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306739 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-cni-bin\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306772 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-systemd\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306794 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-node-log\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306829 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovn-node-metrics-cert\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306840 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306860 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306873 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306853 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-openvswitch\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306940 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-node-log" (OuterVolumeSpecName: "node-log") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.306989 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovnkube-script-lib\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.307023 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-systemd-units\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.307067 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k42nn\" (UniqueName: \"kubernetes.io/projected/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-kube-api-access-k42nn\") pod \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\" (UID: \"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d\") " Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.307090 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.307137 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.307802 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.307822 4664 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.307968 4664 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-node-log\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.308033 4664 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.308093 4664 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.308153 4664 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.308214 4664 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-log-socket\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.308350 4664 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.308442 4664 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.308555 4664 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-slash\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.308717 4664 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.308893 4664 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.307888 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.308980 4664 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.309119 4664 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.309196 4664 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.309277 4664 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.313257 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.313289 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-kube-api-access-k42nn" (OuterVolumeSpecName: "kube-api-access-k42nn") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "kube-api-access-k42nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.321774 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" (UID: "8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410373 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410430 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-kubelet\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410453 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-run-systemd\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410482 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-cni-bin\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410509 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvgpn\" (UniqueName: \"kubernetes.io/projected/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-kube-api-access-pvgpn\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410532 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-ovnkube-script-lib\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410555 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-node-log\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410582 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-run-netns\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410629 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-ovnkube-config\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410651 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-env-overrides\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410684 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-systemd-units\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410709 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-var-lib-openvswitch\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410730 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-run-ovn\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410751 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-slash\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410777 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-etc-openvswitch\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410803 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-cni-netd\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410831 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-run-ovn-kubernetes\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410852 4664 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-ovn-node-metrics-cert\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410875 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-log-socket\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410899 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-run-openvswitch\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410940 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k42nn\" (UniqueName: \"kubernetes.io/projected/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-kube-api-access-k42nn\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410954 4664 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410966 4664 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410978 4664 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.410989 4664 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511685 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-systemd-units\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511743 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-var-lib-openvswitch\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511771 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-run-ovn\") pod \"ovnkube-node-x2777\" (UID: 
\"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511795 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-slash\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511812 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-systemd-units\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511842 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-etc-openvswitch\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511815 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-etc-openvswitch\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511866 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-slash\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511882 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-cni-netd\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511874 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-var-lib-openvswitch\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511911 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-run-ovn-kubernetes\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511935 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-ovn-node-metrics-cert\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 
08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511926 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-run-ovn\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511957 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-cni-netd\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511960 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-log-socket\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.511961 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-run-ovn-kubernetes\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512018 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-log-socket\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512023 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-run-openvswitch\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512091 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512113 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-run-openvswitch\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512124 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-kubelet\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512147 4664 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-run-systemd\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512159 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512168 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-run-systemd\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512182 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-cni-bin\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512221 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-cni-bin\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512254 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvgpn\" (UniqueName: \"kubernetes.io/projected/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-kube-api-access-pvgpn\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512184 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-kubelet\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512287 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-ovnkube-script-lib\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512316 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-node-log\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512361 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-run-netns\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512402 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-node-log\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512400 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-ovnkube-config\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512441 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-env-overrides\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.512450 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-host-run-netns\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.513058 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-env-overrides\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.513182 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-ovnkube-config\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.513547 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-ovnkube-script-lib\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.516057 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-ovn-node-metrics-cert\") pod \"ovnkube-node-x2777\" (UID: \"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.528787 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvgpn\" (UniqueName: \"kubernetes.io/projected/fea2845f-2ae0-4e33-b28b-d11e9d85ea10-kube-api-access-pvgpn\") pod \"ovnkube-node-x2777\" (UID: 
\"fea2845f-2ae0-4e33-b28b-d11e9d85ea10\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.552658 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovnkube-controller/3.log" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.555516 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovn-acl-logging/0.log" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556110 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2jpvm_8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/ovn-controller/0.log" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556592 4664 generic.go:334] "Generic (PLEG): container finished" podID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerID="74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d" exitCode=0 Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556638 4664 generic.go:334] "Generic (PLEG): container finished" podID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerID="a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b" exitCode=0 Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556649 4664 generic.go:334] "Generic (PLEG): container finished" podID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerID="5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1" exitCode=0 Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556658 4664 generic.go:334] "Generic (PLEG): container finished" podID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerID="2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add" exitCode=0 Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556665 4664 generic.go:334] "Generic (PLEG): container finished" podID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerID="5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2" exitCode=0 Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556673 4664 generic.go:334] "Generic (PLEG): container finished" podID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerID="67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac" exitCode=0 Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556681 4664 generic.go:334] "Generic (PLEG): container finished" podID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerID="82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00" exitCode=143 Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556672 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerDied","Data":"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556729 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerDied","Data":"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556746 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerDied","Data":"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"} Oct 03 08:01:31 crc 
kubenswrapper[4664]: I1003 08:01:31.556758 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerDied","Data":"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556770 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerDied","Data":"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556689 4664 generic.go:334] "Generic (PLEG): container finished" podID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" containerID="023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db" exitCode=143 Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556794 4664 scope.go:117] "RemoveContainer" containerID="74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556782 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerDied","Data":"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556843 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556858 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556952 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556972 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556978 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556985 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556990 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.556996 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557005 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"} 
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557011 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557032 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerDied","Data":"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557053 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557059 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557064 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557069 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557075 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557079 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557084 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557089 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557094 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557098 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557106 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerDied","Data":"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557114 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557122 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557127 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557135 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557143 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557148 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557153 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557159 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557164 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557169 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557176 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2jpvm" event={"ID":"8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d","Type":"ContainerDied","Data":"1f80a89734a09e0e7687baa3ab9f827adc9a0a3af6e78599cde9d3d8954f5cac"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557183 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557189 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557195 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557200 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557205 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557210 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557215 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557221 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557227 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.557232 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.558477 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-72cm2_6998d742-8d17-4f20-ab52-c30d9f7b0b89/kube-multus/2.log" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.559532 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-72cm2_6998d742-8d17-4f20-ab52-c30d9f7b0b89/kube-multus/1.log" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.559580 4664 generic.go:334] "Generic (PLEG): container finished" podID="6998d742-8d17-4f20-ab52-c30d9f7b0b89" containerID="4d89d1654dd2e1ba9bea8e000dac63af587a2f465fc475abfe21847cbc232292" exitCode=2 Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.559629 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-72cm2" event={"ID":"6998d742-8d17-4f20-ab52-c30d9f7b0b89","Type":"ContainerDied","Data":"4d89d1654dd2e1ba9bea8e000dac63af587a2f465fc475abfe21847cbc232292"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.559654 4664 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"482e54714945acaea85fdeeb4b89eb9b16568c96319d07eb812ef88bd5faeb85"} Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.560204 4664 scope.go:117] "RemoveContainer" containerID="4d89d1654dd2e1ba9bea8e000dac63af587a2f465fc475abfe21847cbc232292" Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.562405 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x2777"
Oct 03 08:01:31 crc kubenswrapper[4664]: W1003 08:01:31.594132 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfea2845f_2ae0_4e33_b28b_d11e9d85ea10.slice/crio-00c3fd925b3bd9e81101d6ed84f8c270bb87e13c6debe0a8de05efe2b60d6d0a WatchSource:0}: Error finding container 00c3fd925b3bd9e81101d6ed84f8c270bb87e13c6debe0a8de05efe2b60d6d0a: Status 404 returned error can't find the container with id 00c3fd925b3bd9e81101d6ed84f8c270bb87e13c6debe0a8de05efe2b60d6d0a
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.606506 4664 scope.go:117] "RemoveContainer" containerID="46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.668232 4664 scope.go:117] "RemoveContainer" containerID="a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.694946 4664 scope.go:117] "RemoveContainer" containerID="5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.703729 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2jpvm"]
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.714768 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2jpvm"]
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.716033 4664 scope.go:117] "RemoveContainer" containerID="2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.733186 4664 scope.go:117] "RemoveContainer" containerID="5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.749401 4664 scope.go:117] "RemoveContainer" containerID="67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.764213 4664 scope.go:117] "RemoveContainer" containerID="82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.783434 4664 scope.go:117] "RemoveContainer" containerID="023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.838055 4664 scope.go:117] "RemoveContainer" containerID="cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.855973 4664 scope.go:117] "RemoveContainer" containerID="74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"
Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.856641 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d\": container with ID starting with 74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d not found: ID does not exist" containerID="74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.856771 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"} err="failed to get container status \"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d\": rpc error: code = NotFound desc = could not find container \"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d\": container with ID starting with 74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.856884 4664 scope.go:117] "RemoveContainer" containerID="46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"
Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.857322 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\": container with ID starting with 46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690 not found: ID does not exist" containerID="46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.857342 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"} err="failed to get container status \"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\": rpc error: code = NotFound desc = could not find container \"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\": container with ID starting with 46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.857357 4664 scope.go:117] "RemoveContainer" containerID="a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"
Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.857994 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\": container with ID starting with a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b not found: ID does not exist" containerID="a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.858099 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"} err="failed to get container status \"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\": rpc error: code = NotFound desc = could not find container \"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\": container with ID starting with a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.858167 4664 scope.go:117] "RemoveContainer" containerID="5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"
Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.858511 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\": container with ID starting with 5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1 not found: ID does not exist" containerID="5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.858567 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"} err="failed to get container status \"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\": rpc error: code = NotFound desc = could not find container \"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\": container with ID starting with 5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.858583 4664 scope.go:117] "RemoveContainer" containerID="2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"
Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.858864 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\": container with ID starting with 2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add not found: ID does not exist" containerID="2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.858890 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"} err="failed to get container status \"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\": rpc error: code = NotFound desc = could not find container \"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\": container with ID starting with 2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.858908 4664 scope.go:117] "RemoveContainer" containerID="5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"
Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.859383 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\": container with ID starting with 5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2 not found: ID does not exist" containerID="5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.859410 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"} err="failed to get container status \"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\": rpc error: code = NotFound desc = could not find container \"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\": container with ID starting with 5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.859429 4664 scope.go:117] "RemoveContainer" containerID="67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"
Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.859726 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\": container with ID starting with 67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac not found: ID does not exist" containerID="67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.859750 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"} err="failed to get container status \"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\": rpc error: code = NotFound desc = could not find container \"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\": container with ID starting with 67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.859763 4664 scope.go:117] "RemoveContainer" containerID="82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"
Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.859969 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\": container with ID starting with 82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00 not found: ID does not exist" containerID="82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.859995 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"} err="failed to get container status \"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\": rpc error: code = NotFound desc = could not find container \"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\": container with ID starting with 82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.860015 4664 scope.go:117] "RemoveContainer" containerID="023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"
Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.860279 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\": container with ID starting with 023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db not found: ID does not exist" containerID="023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.860307 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"} err="failed to get container status \"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\": rpc error: code = NotFound desc = could not find container \"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\": container with ID starting with 023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.860379 4664 scope.go:117] "RemoveContainer" containerID="cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12"
Oct 03 08:01:31 crc kubenswrapper[4664]: E1003 08:01:31.860838 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\": container with ID starting with cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12 not found: ID does not exist" containerID="cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.860937 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12"} err="failed to get container status \"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\": rpc error: code = NotFound desc = could not find container \"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\": container with ID starting with cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.861003 4664 scope.go:117] "RemoveContainer" containerID="74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.861481 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"} err="failed to get container status \"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d\": rpc error: code = NotFound desc = could not find container \"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d\": container with ID starting with 74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.861553 4664 scope.go:117] "RemoveContainer" containerID="46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.861974 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"} err="failed to get container status \"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\": rpc error: code = NotFound desc = could not find container \"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\": container with ID starting with 46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.862061 4664 scope.go:117] "RemoveContainer" containerID="a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.862376 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"} err="failed to get container status \"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\": rpc error: code = NotFound desc = could not find container \"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\": container with ID starting with a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.862410 4664 scope.go:117] "RemoveContainer" containerID="5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.862693 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"} err="failed to get container status \"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\": rpc error: code = NotFound desc = could not find container \"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\": container with ID starting with 5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.862764 4664 scope.go:117] "RemoveContainer" containerID="2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.863076 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"} err="failed to get container status \"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\": rpc error: code = NotFound desc = could not find container \"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\": container with ID starting with 2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.863105 4664 scope.go:117] "RemoveContainer" containerID="5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.863382 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"} err="failed to get container status \"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\": rpc error: code = NotFound desc = could not find container \"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\": container with ID starting with 5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.863412 4664 scope.go:117] "RemoveContainer" containerID="67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.863660 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"} err="failed to get container status \"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\": rpc error: code = NotFound desc = could not find container \"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\": container with ID starting with 67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.863705 4664 scope.go:117] "RemoveContainer" containerID="82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.864837 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"} err="failed to get container status \"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\": rpc error: code = NotFound desc = could not find container \"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\": container with ID starting with 82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.864873 4664 scope.go:117] "RemoveContainer" containerID="023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.865205 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"} err="failed to get container status \"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\": rpc error: code = NotFound desc = could not find container \"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\": container with ID starting with 023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.865290 4664 scope.go:117] "RemoveContainer" containerID="cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.866070 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12"} err="failed to get container status \"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\": rpc error: code = NotFound desc = could not find container \"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\": container with ID starting with cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.866426 4664 scope.go:117] "RemoveContainer" containerID="74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.866965 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"} err="failed to get container status \"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d\": rpc error: code = NotFound desc = could not find container \"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d\": container with ID starting with 74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.867040 4664 scope.go:117] "RemoveContainer" containerID="46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.867348 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"} err="failed to get container status \"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\": rpc error: code = NotFound desc = could not find container \"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\": container with ID starting with 46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.867376 4664 scope.go:117] "RemoveContainer" containerID="a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.867622 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"} err="failed to get container status \"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\": rpc error: code = NotFound desc = could not find container \"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\": container with ID starting with a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.867703 4664 scope.go:117] "RemoveContainer" containerID="5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.868029 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"} err="failed to get container status \"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\": rpc error: code = NotFound desc = could not find container \"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\": container with ID starting with 5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.868064 4664 scope.go:117] "RemoveContainer" containerID="2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.868289 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"} err="failed to get container status \"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\": rpc error: code = NotFound desc = could not find container \"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\": container with ID starting with 2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.868366 4664 scope.go:117] "RemoveContainer" containerID="5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.868649 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"} err="failed to get container status \"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\": rpc error: code = NotFound desc = could not find container \"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\": container with ID starting with 5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.868900 4664 scope.go:117] "RemoveContainer" containerID="67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.869162 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"} err="failed to get container status \"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\": rpc error: code = NotFound desc = could not find container \"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\": container with ID starting with 67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.869230 4664 scope.go:117] "RemoveContainer" containerID="82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.869575 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"} err="failed to get container status \"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\": rpc error: code = NotFound desc = could not find container \"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\": container with ID starting with 82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.869631 4664 scope.go:117] "RemoveContainer" containerID="023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.870257 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"} err="failed to get container status \"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\": rpc error: code = NotFound desc = could not find container \"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\": container with ID starting with 023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.870342 4664 scope.go:117] "RemoveContainer" containerID="cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.870688 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12"} err="failed to get container status \"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\": rpc error: code = NotFound desc = could not find container \"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\": container with ID starting with cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.870721 4664 scope.go:117] "RemoveContainer" containerID="74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.871104 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"} err="failed to get container status \"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d\": rpc error: code = NotFound desc = could not find container \"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d\": container with ID starting with 74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.871132 4664 scope.go:117] "RemoveContainer" containerID="46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.871380 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690"} err="failed to get container status \"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\": rpc error: code = NotFound desc = could not find container \"46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690\": container with ID starting with 46c7dd53736fe9d646bd1433c7f564d5096f83984954d65d5f63ce953cd49690 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.871460 4664 scope.go:117] "RemoveContainer" containerID="a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.871796 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b"} err="failed to get container status \"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\": rpc error: code = NotFound desc = could not find container \"a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b\": container with ID starting with a079e808aae4132f276941674e62a8255ff6dc40a8f45080c37fc54aedaefc5b not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.871821 4664 scope.go:117] "RemoveContainer" containerID="5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.873002 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1"} err="failed to get container status \"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\": rpc error: code = NotFound desc = could not find container \"5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1\": container with ID starting with 5cc34879f10ef14e78ab5683fecfc8a8cce3710c9f0c8f9d9ae23b7886670de1 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.873048 4664 scope.go:117] "RemoveContainer" containerID="2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.873342 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add"} err="failed to get container status \"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\": rpc error: code = NotFound desc = could not find container \"2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add\": container with ID starting with 2e3944b4db04263d2921290b81180d1d976e0b72cdada1c36c0bad548ed55add not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.873427 4664 scope.go:117] "RemoveContainer" containerID="5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.873736 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2"} err="failed to get container status \"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\": rpc error: code = NotFound desc = could not find container \"5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2\": container with ID starting with 5e6e3a71d2db00b69b5825f359fa48028fbf78e16f1afb0afc7b820f8dd826f2 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.873757 4664 scope.go:117] "RemoveContainer" containerID="67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.874124 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac"} err="failed to get container status \"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\": rpc error: code = NotFound desc = could not find container \"67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac\": container with ID starting with 67ab8f0225fb975e4217e059c6fc35b2d61b88acbb26f61e267aea0b1c8762ac not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.874180 4664 scope.go:117] "RemoveContainer" containerID="82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.874450 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00"} err="failed to get container status \"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\": rpc error: code = NotFound desc = could not find container \"82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00\": container with ID starting with 82d159e84992209c5332c8646c54c33f9067e29681471c789db33385fa30de00 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.874468 4664 scope.go:117] "RemoveContainer" containerID="023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.874666 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db"} err="failed to get container status \"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\": rpc error: code = NotFound desc = could not find container \"023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db\": container with ID starting with 023341f8141a15bbf5a498fa13cebb92158028038618663afb9054146bf274db not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.874687 4664 scope.go:117] "RemoveContainer" containerID="cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.874937 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12"} err="failed to get container status \"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\": rpc error: code = NotFound desc = could not find container \"cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12\": container with ID starting with cadaad9ec967e5c3f12aa9e8322dda3863f66b21f6378568e4f0e24b3b5e5c12 not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.875068 4664 scope.go:117] "RemoveContainer" containerID="74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.875513 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d"} err="failed to get container status \"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d\": rpc error: code = NotFound desc = could not find container \"74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d\": container with ID starting with 74dcdcf3bafa074aa1221fd77fffcad57d7519c23c8edc43218e8762a0ca5e1d not found: ID does not exist"
Oct 03 08:01:31 crc kubenswrapper[4664]: I1003 08:01:31.881906 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d" path="/var/lib/kubelet/pods/8bb65800-b794-4cb7-8fdd-ebbf3a8ff78d/volumes"
Oct 03 08:01:32 crc kubenswrapper[4664]: I1003 08:01:32.567661 4664 generic.go:334] "Generic (PLEG): container finished" podID="fea2845f-2ae0-4e33-b28b-d11e9d85ea10" containerID="c52d28ddacad4b1ee2495bff6df0b7782aff2f7eb3afb0b6be9b5b4e4e25bb12" exitCode=0
Oct 03 08:01:32 crc kubenswrapper[4664]: I1003 08:01:32.567748 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2777" event={"ID":"fea2845f-2ae0-4e33-b28b-d11e9d85ea10","Type":"ContainerDied","Data":"c52d28ddacad4b1ee2495bff6df0b7782aff2f7eb3afb0b6be9b5b4e4e25bb12"}
Oct 03 08:01:32 crc kubenswrapper[4664]: I1003 08:01:32.568117 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2777" event={"ID":"fea2845f-2ae0-4e33-b28b-d11e9d85ea10","Type":"ContainerStarted","Data":"00c3fd925b3bd9e81101d6ed84f8c270bb87e13c6debe0a8de05efe2b60d6d0a"}
Oct 03 08:01:32 crc kubenswrapper[4664]: I1003 08:01:32.573784 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-72cm2_6998d742-8d17-4f20-ab52-c30d9f7b0b89/kube-multus/2.log"
Oct 03 08:01:32 crc kubenswrapper[4664]: I1003 08:01:32.574368 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-72cm2_6998d742-8d17-4f20-ab52-c30d9f7b0b89/kube-multus/1.log"
Oct 03 08:01:32 crc kubenswrapper[4664]: I1003 08:01:32.574424 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-72cm2" event={"ID":"6998d742-8d17-4f20-ab52-c30d9f7b0b89","Type":"ContainerStarted","Data":"4e8b77c64875ab0367748b4de9b74c2847a38702ce0d26082a1722f50a5ac5ca"}
Oct 03 08:01:33 crc kubenswrapper[4664]: I1003 08:01:33.584041 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2777" event={"ID":"fea2845f-2ae0-4e33-b28b-d11e9d85ea10","Type":"ContainerStarted","Data":"646d02891f60831eff4744c073e915b01d5f0558b18328cdec7f39d37d14e473"}
Oct 03 08:01:33 crc kubenswrapper[4664]: I1003 08:01:33.584558 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2777" event={"ID":"fea2845f-2ae0-4e33-b28b-d11e9d85ea10","Type":"ContainerStarted","Data":"86b3e52044a9e6f3e4ebc5a40a671018abf8bc009d9d70bd4738ac2e5709e7ec"}
Oct 03 08:01:33 crc kubenswrapper[4664]: I1003 08:01:33.584577 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2777" event={"ID":"fea2845f-2ae0-4e33-b28b-d11e9d85ea10","Type":"ContainerStarted","Data":"9495b85d21bd95d46834c1a7ad91ab955a93ab2f5ec1b7e67014cb90f743bc5f"}
Oct 03 08:01:33 crc kubenswrapper[4664]: I1003 08:01:33.584590 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2777" event={"ID":"fea2845f-2ae0-4e33-b28b-d11e9d85ea10","Type":"ContainerStarted","Data":"a46dabc82828a55d1a18da82ef7e4e659b3667cef5bca0cf5950ab30fe9174e0"}
Oct 03 08:01:33 crc kubenswrapper[4664]: I1003 08:01:33.584625 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2777" event={"ID":"fea2845f-2ae0-4e33-b28b-d11e9d85ea10","Type":"ContainerStarted","Data":"5985042e746be94a8eca14d4155fecf5de320f055a23b83a765e1455ad1f2430"}
Oct 03 08:01:33 crc kubenswrapper[4664]: I1003 08:01:33.584637 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2777" event={"ID":"fea2845f-2ae0-4e33-b28b-d11e9d85ea10","Type":"ContainerStarted","Data":"bde0f22f2da3d202fe0cedfea138ec184f3f2a972cbb50d7796170839405747c"}
Oct 03 08:01:35 crc kubenswrapper[4664]: I1003 08:01:35.598226 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2777" event={"ID":"fea2845f-2ae0-4e33-b28b-d11e9d85ea10","Type":"ContainerStarted","Data":"e0b43ae18d018e6d42d706d603ba01d903e57e942322ed013b46406cde54ac1b"}
Oct 03 08:01:38 crc kubenswrapper[4664]: I1003 08:01:38.617743 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2777" event={"ID":"fea2845f-2ae0-4e33-b28b-d11e9d85ea10","Type":"ContainerStarted","Data":"e971791ee1116a77424f7ec6fa1255c484b766ceb2aebd9e72a8e4cb0e7feddf"}
Oct 03 08:01:38 crc kubenswrapper[4664]: I1003 08:01:38.618211 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x2777"
Oct 03 08:01:38 crc kubenswrapper[4664]: I1003 08:01:38.650629 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x2777" podStartSLOduration=7.650584357 podStartE2EDuration="7.650584357s" podCreationTimestamp="2025-10-03 08:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:01:38.649187109 +0000 UTC m=+799.470377629" watchObservedRunningTime="2025-10-03 08:01:38.650584357 +0000 UTC m=+799.471774847"
Oct 03 08:01:38 crc kubenswrapper[4664]: I1003 08:01:38.661704 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x2777"
Oct 03 08:01:39 crc kubenswrapper[4664]: I1003 08:01:39.622879 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x2777"
Oct 03 08:01:39 crc kubenswrapper[4664]: I1003 08:01:39.623250 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x2777"
Oct 03 08:01:39 crc kubenswrapper[4664]: I1003 08:01:39.649755 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x2777"
Oct 03 08:01:39 crc kubenswrapper[4664]: I1003 08:01:39.818024 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xdgnx"
Oct 03 08:01:39 crc kubenswrapper[4664]: I1003 08:01:39.892254 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xdgnx"]
Oct 03 08:01:40 crc kubenswrapper[4664]: I1003 08:01:40.628023 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xdgnx" podUID="a23c5b33-2c0a-4fd8-ab02-2bdb1130a468" containerName="registry-server" containerID="cri-o://f601128437d606a98290048770341bda91c4f2959d07bd40533983ce7a71c3ad" gracePeriod=2
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.487095 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xdgnx"
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.634818 4664 generic.go:334] "Generic (PLEG): container finished" podID="a23c5b33-2c0a-4fd8-ab02-2bdb1130a468" containerID="f601128437d606a98290048770341bda91c4f2959d07bd40533983ce7a71c3ad" exitCode=0
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.634866 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xdgnx"
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.634878 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdgnx" event={"ID":"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468","Type":"ContainerDied","Data":"f601128437d606a98290048770341bda91c4f2959d07bd40533983ce7a71c3ad"}
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.634916 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdgnx" event={"ID":"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468","Type":"ContainerDied","Data":"41516f657ed1357d8bfb091a6504ca742104a39b02f18a9c93dd707233083c45"}
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.634938 4664 scope.go:117] "RemoveContainer" containerID="f601128437d606a98290048770341bda91c4f2959d07bd40533983ce7a71c3ad"
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.643803 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg8xx\" (UniqueName: \"kubernetes.io/projected/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-kube-api-access-zg8xx\") pod \"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468\" (UID: \"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468\") "
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.643882 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-utilities\") pod \"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468\" (UID: \"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468\") "
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.643930 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-catalog-content\") pod \"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468\" (UID: \"a23c5b33-2c0a-4fd8-ab02-2bdb1130a468\") "
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.644759 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-utilities" (OuterVolumeSpecName: "utilities") pod "a23c5b33-2c0a-4fd8-ab02-2bdb1130a468" (UID: "a23c5b33-2c0a-4fd8-ab02-2bdb1130a468"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.655649 4664 scope.go:117] "RemoveContainer" containerID="970dd5b273c4279232457ebc6aab29153aa669d2107b734cc0b49cd51d954851"
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.662752 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-kube-api-access-zg8xx" (OuterVolumeSpecName: "kube-api-access-zg8xx") pod "a23c5b33-2c0a-4fd8-ab02-2bdb1130a468" (UID: "a23c5b33-2c0a-4fd8-ab02-2bdb1130a468"). InnerVolumeSpecName "kube-api-access-zg8xx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.673431 4664 scope.go:117] "RemoveContainer" containerID="202bd688a282a89bee6cd992ac37c678c045f64d50e02190ca10a962b6830515"
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.690380 4664 scope.go:117] "RemoveContainer" containerID="f601128437d606a98290048770341bda91c4f2959d07bd40533983ce7a71c3ad"
Oct 03 08:01:41 crc kubenswrapper[4664]: E1003 08:01:41.690813 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f601128437d606a98290048770341bda91c4f2959d07bd40533983ce7a71c3ad\": container with ID starting with f601128437d606a98290048770341bda91c4f2959d07bd40533983ce7a71c3ad not found: ID does not exist" containerID="f601128437d606a98290048770341bda91c4f2959d07bd40533983ce7a71c3ad"
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.690845 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f601128437d606a98290048770341bda91c4f2959d07bd40533983ce7a71c3ad"} err="failed to get container status \"f601128437d606a98290048770341bda91c4f2959d07bd40533983ce7a71c3ad\": rpc error: code = NotFound desc = could not find container \"f601128437d606a98290048770341bda91c4f2959d07bd40533983ce7a71c3ad\": container with ID starting with f601128437d606a98290048770341bda91c4f2959d07bd40533983ce7a71c3ad not found: ID does not exist"
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.690867 4664 scope.go:117] "RemoveContainer" containerID="970dd5b273c4279232457ebc6aab29153aa669d2107b734cc0b49cd51d954851"
Oct 03 08:01:41 crc kubenswrapper[4664]: E1003 08:01:41.691207 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"970dd5b273c4279232457ebc6aab29153aa669d2107b734cc0b49cd51d954851\": container with ID starting with 970dd5b273c4279232457ebc6aab29153aa669d2107b734cc0b49cd51d954851 not found: ID does not exist" containerID="970dd5b273c4279232457ebc6aab29153aa669d2107b734cc0b49cd51d954851"
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.691234 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"970dd5b273c4279232457ebc6aab29153aa669d2107b734cc0b49cd51d954851"} err="failed to get container status \"970dd5b273c4279232457ebc6aab29153aa669d2107b734cc0b49cd51d954851\": rpc error: code = NotFound desc = could not find container \"970dd5b273c4279232457ebc6aab29153aa669d2107b734cc0b49cd51d954851\": container with ID starting with 970dd5b273c4279232457ebc6aab29153aa669d2107b734cc0b49cd51d954851 not found: ID does not exist"
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.691247 4664 scope.go:117] "RemoveContainer" containerID="202bd688a282a89bee6cd992ac37c678c045f64d50e02190ca10a962b6830515"
Oct 03 08:01:41 crc kubenswrapper[4664]: E1003 08:01:41.691474 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"202bd688a282a89bee6cd992ac37c678c045f64d50e02190ca10a962b6830515\": container with ID starting with 202bd688a282a89bee6cd992ac37c678c045f64d50e02190ca10a962b6830515 not found: ID does not exist" containerID="202bd688a282a89bee6cd992ac37c678c045f64d50e02190ca10a962b6830515"
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.691492 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"202bd688a282a89bee6cd992ac37c678c045f64d50e02190ca10a962b6830515"} err="failed to get container status \"202bd688a282a89bee6cd992ac37c678c045f64d50e02190ca10a962b6830515\": rpc error: code = NotFound desc = could not find container \"202bd688a282a89bee6cd992ac37c678c045f64d50e02190ca10a962b6830515\": container with ID starting with 202bd688a282a89bee6cd992ac37c678c045f64d50e02190ca10a962b6830515 not found: ID does not exist"
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.696363 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a23c5b33-2c0a-4fd8-ab02-2bdb1130a468" (UID: "a23c5b33-2c0a-4fd8-ab02-2bdb1130a468"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.745139 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg8xx\" (UniqueName: \"kubernetes.io/projected/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-kube-api-access-zg8xx\") on node \"crc\" DevicePath \"\""
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.745173 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.745184 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.954890 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xdgnx"]
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.959151 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xdgnx"]
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.987682 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 08:01:41 crc kubenswrapper[4664]: I1003 08:01:41.987749 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 08:01:43 crc kubenswrapper[4664]: I1003 08:01:43.889454 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a23c5b33-2c0a-4fd8-ab02-2bdb1130a468" path="/var/lib/kubelet/pods/a23c5b33-2c0a-4fd8-ab02-2bdb1130a468/volumes"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.471251 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b5jgw"]
Oct 03 08:01:44 crc kubenswrapper[4664]: E1003 08:01:44.471906 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23c5b33-2c0a-4fd8-ab02-2bdb1130a468" containerName="extract-content"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.471927 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23c5b33-2c0a-4fd8-ab02-2bdb1130a468" containerName="extract-content"
Oct 03 08:01:44 crc kubenswrapper[4664]: E1003 08:01:44.471951 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23c5b33-2c0a-4fd8-ab02-2bdb1130a468" containerName="extract-utilities"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.471964 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23c5b33-2c0a-4fd8-ab02-2bdb1130a468" containerName="extract-utilities"
Oct 03 08:01:44 crc kubenswrapper[4664]: E1003 08:01:44.471992 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23c5b33-2c0a-4fd8-ab02-2bdb1130a468" containerName="registry-server"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.472005 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23c5b33-2c0a-4fd8-ab02-2bdb1130a468" containerName="registry-server"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.472211 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="a23c5b33-2c0a-4fd8-ab02-2bdb1130a468" containerName="registry-server"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.473420 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.486739 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5jgw"]
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.583152 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a5a653-b1bd-420f-a23c-b764582b0810-catalog-content\") pod \"redhat-marketplace-b5jgw\" (UID: \"55a5a653-b1bd-420f-a23c-b764582b0810\") " pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.583221 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr24f\" (UniqueName: \"kubernetes.io/projected/55a5a653-b1bd-420f-a23c-b764582b0810-kube-api-access-xr24f\") pod \"redhat-marketplace-b5jgw\" (UID: \"55a5a653-b1bd-420f-a23c-b764582b0810\") " pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.583245 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a5a653-b1bd-420f-a23c-b764582b0810-utilities\") pod \"redhat-marketplace-b5jgw\" (UID: \"55a5a653-b1bd-420f-a23c-b764582b0810\") " pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.684995 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a5a653-b1bd-420f-a23c-b764582b0810-catalog-content\") pod \"redhat-marketplace-b5jgw\" (UID: \"55a5a653-b1bd-420f-a23c-b764582b0810\") " pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.685125 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr24f\" (UniqueName: \"kubernetes.io/projected/55a5a653-b1bd-420f-a23c-b764582b0810-kube-api-access-xr24f\") pod \"redhat-marketplace-b5jgw\" (UID: \"55a5a653-b1bd-420f-a23c-b764582b0810\") " pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.685167 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a5a653-b1bd-420f-a23c-b764582b0810-utilities\") pod \"redhat-marketplace-b5jgw\" (UID: \"55a5a653-b1bd-420f-a23c-b764582b0810\") " pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.685752 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a5a653-b1bd-420f-a23c-b764582b0810-utilities\") pod \"redhat-marketplace-b5jgw\" (UID: \"55a5a653-b1bd-420f-a23c-b764582b0810\") " pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.686298 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a5a653-b1bd-420f-a23c-b764582b0810-catalog-content\") pod \"redhat-marketplace-b5jgw\" (UID: \"55a5a653-b1bd-420f-a23c-b764582b0810\") " pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.703836 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr24f\" (UniqueName: \"kubernetes.io/projected/55a5a653-b1bd-420f-a23c-b764582b0810-kube-api-access-xr24f\") pod \"redhat-marketplace-b5jgw\" (UID: \"55a5a653-b1bd-420f-a23c-b764582b0810\") " pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:44 crc kubenswrapper[4664]: I1003 08:01:44.791623 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:45 crc kubenswrapper[4664]: I1003 08:01:45.011442 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5jgw"]
Oct 03 08:01:45 crc kubenswrapper[4664]: W1003 08:01:45.014770 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55a5a653_b1bd_420f_a23c_b764582b0810.slice/crio-f139dde5a23f0df34433440cc0f7a695d57e2adc9b3bcb65aebc8f7bc60fe8f0 WatchSource:0}: Error finding container f139dde5a23f0df34433440cc0f7a695d57e2adc9b3bcb65aebc8f7bc60fe8f0: Status 404 returned error can't find the container with id f139dde5a23f0df34433440cc0f7a695d57e2adc9b3bcb65aebc8f7bc60fe8f0
Oct 03 08:01:45 crc kubenswrapper[4664]: I1003 08:01:45.658360 4664 generic.go:334] "Generic (PLEG): container finished" podID="55a5a653-b1bd-420f-a23c-b764582b0810" containerID="8d802cce05eca4fe0688c5966eb4bf12796f253db96e20779472ad11fa093e76" exitCode=0
Oct 03 08:01:45 crc kubenswrapper[4664]: I1003 08:01:45.658426 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5jgw" event={"ID":"55a5a653-b1bd-420f-a23c-b764582b0810","Type":"ContainerDied","Data":"8d802cce05eca4fe0688c5966eb4bf12796f253db96e20779472ad11fa093e76"}
Oct 03 08:01:45 crc kubenswrapper[4664]: I1003 08:01:45.658451 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5jgw" event={"ID":"55a5a653-b1bd-420f-a23c-b764582b0810","Type":"ContainerStarted","Data":"f139dde5a23f0df34433440cc0f7a695d57e2adc9b3bcb65aebc8f7bc60fe8f0"}
Oct 03 08:01:46 crc kubenswrapper[4664]: I1003 08:01:46.665028 4664 generic.go:334] "Generic (PLEG): container finished" podID="55a5a653-b1bd-420f-a23c-b764582b0810" containerID="09244c97e7fe82f776a44308a2c71366de4a2b8cc39b817a98c11381253f392d" exitCode=0
Oct 03 08:01:46 crc kubenswrapper[4664]: I1003 08:01:46.665078 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5jgw" event={"ID":"55a5a653-b1bd-420f-a23c-b764582b0810","Type":"ContainerDied","Data":"09244c97e7fe82f776a44308a2c71366de4a2b8cc39b817a98c11381253f392d"}
Oct 03 08:01:47 crc kubenswrapper[4664]: I1003 08:01:47.673626 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5jgw" event={"ID":"55a5a653-b1bd-420f-a23c-b764582b0810","Type":"ContainerStarted","Data":"dae2432650b9da24951973cd5a17e048d543a383f074a36d44967572dcb0c3a2"}
Oct 03 08:01:47 crc kubenswrapper[4664]: I1003 08:01:47.689086 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b5jgw" podStartSLOduration=2.015989595 podStartE2EDuration="3.689066094s" podCreationTimestamp="2025-10-03 08:01:44 +0000 UTC" firstStartedPulling="2025-10-03 08:01:45.659972824 +0000 UTC m=+806.481163314" lastFinishedPulling="2025-10-03 08:01:47.333049323 +0000 UTC m=+808.154239813" observedRunningTime="2025-10-03 08:01:47.688031696 +0000 UTC m=+808.509222196" watchObservedRunningTime="2025-10-03 08:01:47.689066094 +0000 UTC m=+808.510256594"
Oct 03 08:01:54 crc kubenswrapper[4664]: I1003 08:01:54.792129 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:54 crc kubenswrapper[4664]: I1003 08:01:54.792366 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:54 crc kubenswrapper[4664]: I1003 08:01:54.829733 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:55 crc kubenswrapper[4664]: I1003 08:01:55.793961 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:55 crc kubenswrapper[4664]: I1003 08:01:55.843001 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5jgw"]
Oct 03 08:01:57 crc kubenswrapper[4664]: I1003 08:01:57.735426 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b5jgw" podUID="55a5a653-b1bd-420f-a23c-b764582b0810" containerName="registry-server" containerID="cri-o://dae2432650b9da24951973cd5a17e048d543a383f074a36d44967572dcb0c3a2" gracePeriod=2
Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.127452 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b5jgw"
Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.259229 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a5a653-b1bd-420f-a23c-b764582b0810-utilities\") pod \"55a5a653-b1bd-420f-a23c-b764582b0810\" (UID: \"55a5a653-b1bd-420f-a23c-b764582b0810\") "
Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.260074 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a5a653-b1bd-420f-a23c-b764582b0810-catalog-content\") pod \"55a5a653-b1bd-420f-a23c-b764582b0810\" (UID: \"55a5a653-b1bd-420f-a23c-b764582b0810\") "
Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.260170 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr24f\" (UniqueName: \"kubernetes.io/projected/55a5a653-b1bd-420f-a23c-b764582b0810-kube-api-access-xr24f\") pod \"55a5a653-b1bd-420f-a23c-b764582b0810\" (UID: \"55a5a653-b1bd-420f-a23c-b764582b0810\") "
Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.260627 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a5a653-b1bd-420f-a23c-b764582b0810-utilities" (OuterVolumeSpecName: "utilities") pod "55a5a653-b1bd-420f-a23c-b764582b0810" (UID: "55a5a653-b1bd-420f-a23c-b764582b0810"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.260865 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a5a653-b1bd-420f-a23c-b764582b0810-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.267541 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a5a653-b1bd-420f-a23c-b764582b0810-kube-api-access-xr24f" (OuterVolumeSpecName: "kube-api-access-xr24f") pod "55a5a653-b1bd-420f-a23c-b764582b0810" (UID: "55a5a653-b1bd-420f-a23c-b764582b0810"). InnerVolumeSpecName "kube-api-access-xr24f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.272441 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a5a653-b1bd-420f-a23c-b764582b0810-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55a5a653-b1bd-420f-a23c-b764582b0810" (UID: "55a5a653-b1bd-420f-a23c-b764582b0810"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.362696 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a5a653-b1bd-420f-a23c-b764582b0810-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.362755 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr24f\" (UniqueName: \"kubernetes.io/projected/55a5a653-b1bd-420f-a23c-b764582b0810-kube-api-access-xr24f\") on node \"crc\" DevicePath \"\"" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.678420 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wx9d8"] Oct 03 08:01:58 crc kubenswrapper[4664]: E1003 08:01:58.679119 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a5a653-b1bd-420f-a23c-b764582b0810" containerName="extract-content" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.679136 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a5a653-b1bd-420f-a23c-b764582b0810" containerName="extract-content" Oct 03 08:01:58 crc kubenswrapper[4664]: E1003 08:01:58.679151 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a5a653-b1bd-420f-a23c-b764582b0810" containerName="registry-server" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.679159 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a5a653-b1bd-420f-a23c-b764582b0810" containerName="registry-server" Oct 03 08:01:58 crc kubenswrapper[4664]: E1003 08:01:58.679171 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a5a653-b1bd-420f-a23c-b764582b0810" containerName="extract-utilities" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.679179 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a5a653-b1bd-420f-a23c-b764582b0810" containerName="extract-utilities" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.679302 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a5a653-b1bd-420f-a23c-b764582b0810" containerName="registry-server" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.680422 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.694376 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wx9d8"] Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.749012 4664 generic.go:334] "Generic (PLEG): container finished" podID="55a5a653-b1bd-420f-a23c-b764582b0810" containerID="dae2432650b9da24951973cd5a17e048d543a383f074a36d44967572dcb0c3a2" exitCode=0 Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.749084 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5jgw" event={"ID":"55a5a653-b1bd-420f-a23c-b764582b0810","Type":"ContainerDied","Data":"dae2432650b9da24951973cd5a17e048d543a383f074a36d44967572dcb0c3a2"} Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.749131 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5jgw" event={"ID":"55a5a653-b1bd-420f-a23c-b764582b0810","Type":"ContainerDied","Data":"f139dde5a23f0df34433440cc0f7a695d57e2adc9b3bcb65aebc8f7bc60fe8f0"} Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.749158 4664 scope.go:117] "RemoveContainer" containerID="dae2432650b9da24951973cd5a17e048d543a383f074a36d44967572dcb0c3a2" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.749344 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b5jgw" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.768444 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44525\" (UniqueName: \"kubernetes.io/projected/b698d75f-d663-48cc-8881-8635d75bf8ac-kube-api-access-44525\") pod \"certified-operators-wx9d8\" (UID: \"b698d75f-d663-48cc-8881-8635d75bf8ac\") " pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.768533 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b698d75f-d663-48cc-8881-8635d75bf8ac-catalog-content\") pod \"certified-operators-wx9d8\" (UID: \"b698d75f-d663-48cc-8881-8635d75bf8ac\") " pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.768665 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b698d75f-d663-48cc-8881-8635d75bf8ac-utilities\") pod \"certified-operators-wx9d8\" (UID: \"b698d75f-d663-48cc-8881-8635d75bf8ac\") " pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.770780 4664 scope.go:117] "RemoveContainer" containerID="09244c97e7fe82f776a44308a2c71366de4a2b8cc39b817a98c11381253f392d" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.790460 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5jgw"] Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.795271 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5jgw"] Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.801375 4664 scope.go:117] "RemoveContainer" containerID="8d802cce05eca4fe0688c5966eb4bf12796f253db96e20779472ad11fa093e76" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.821619 4664 
scope.go:117] "RemoveContainer" containerID="dae2432650b9da24951973cd5a17e048d543a383f074a36d44967572dcb0c3a2" Oct 03 08:01:58 crc kubenswrapper[4664]: E1003 08:01:58.822459 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dae2432650b9da24951973cd5a17e048d543a383f074a36d44967572dcb0c3a2\": container with ID starting with dae2432650b9da24951973cd5a17e048d543a383f074a36d44967572dcb0c3a2 not found: ID does not exist" containerID="dae2432650b9da24951973cd5a17e048d543a383f074a36d44967572dcb0c3a2" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.822512 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dae2432650b9da24951973cd5a17e048d543a383f074a36d44967572dcb0c3a2"} err="failed to get container status \"dae2432650b9da24951973cd5a17e048d543a383f074a36d44967572dcb0c3a2\": rpc error: code = NotFound desc = could not find container \"dae2432650b9da24951973cd5a17e048d543a383f074a36d44967572dcb0c3a2\": container with ID starting with dae2432650b9da24951973cd5a17e048d543a383f074a36d44967572dcb0c3a2 not found: ID does not exist" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.822547 4664 scope.go:117] "RemoveContainer" containerID="09244c97e7fe82f776a44308a2c71366de4a2b8cc39b817a98c11381253f392d" Oct 03 08:01:58 crc kubenswrapper[4664]: E1003 08:01:58.823135 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09244c97e7fe82f776a44308a2c71366de4a2b8cc39b817a98c11381253f392d\": container with ID starting with 09244c97e7fe82f776a44308a2c71366de4a2b8cc39b817a98c11381253f392d not found: ID does not exist" containerID="09244c97e7fe82f776a44308a2c71366de4a2b8cc39b817a98c11381253f392d" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.823179 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09244c97e7fe82f776a44308a2c71366de4a2b8cc39b817a98c11381253f392d"} err="failed to get container status \"09244c97e7fe82f776a44308a2c71366de4a2b8cc39b817a98c11381253f392d\": rpc error: code = NotFound desc = could not find container \"09244c97e7fe82f776a44308a2c71366de4a2b8cc39b817a98c11381253f392d\": container with ID starting with 09244c97e7fe82f776a44308a2c71366de4a2b8cc39b817a98c11381253f392d not found: ID does not exist" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.823212 4664 scope.go:117] "RemoveContainer" containerID="8d802cce05eca4fe0688c5966eb4bf12796f253db96e20779472ad11fa093e76" Oct 03 08:01:58 crc kubenswrapper[4664]: E1003 08:01:58.823572 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d802cce05eca4fe0688c5966eb4bf12796f253db96e20779472ad11fa093e76\": container with ID starting with 8d802cce05eca4fe0688c5966eb4bf12796f253db96e20779472ad11fa093e76 not found: ID does not exist" containerID="8d802cce05eca4fe0688c5966eb4bf12796f253db96e20779472ad11fa093e76" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.823628 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d802cce05eca4fe0688c5966eb4bf12796f253db96e20779472ad11fa093e76"} err="failed to get container status \"8d802cce05eca4fe0688c5966eb4bf12796f253db96e20779472ad11fa093e76\": rpc error: code = NotFound desc = could not find container \"8d802cce05eca4fe0688c5966eb4bf12796f253db96e20779472ad11fa093e76\": container with ID starting with 
8d802cce05eca4fe0688c5966eb4bf12796f253db96e20779472ad11fa093e76 not found: ID does not exist" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.870209 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44525\" (UniqueName: \"kubernetes.io/projected/b698d75f-d663-48cc-8881-8635d75bf8ac-kube-api-access-44525\") pod \"certified-operators-wx9d8\" (UID: \"b698d75f-d663-48cc-8881-8635d75bf8ac\") " pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.870294 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b698d75f-d663-48cc-8881-8635d75bf8ac-catalog-content\") pod \"certified-operators-wx9d8\" (UID: \"b698d75f-d663-48cc-8881-8635d75bf8ac\") " pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.870350 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b698d75f-d663-48cc-8881-8635d75bf8ac-utilities\") pod \"certified-operators-wx9d8\" (UID: \"b698d75f-d663-48cc-8881-8635d75bf8ac\") " pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.870859 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b698d75f-d663-48cc-8881-8635d75bf8ac-catalog-content\") pod \"certified-operators-wx9d8\" (UID: \"b698d75f-d663-48cc-8881-8635d75bf8ac\") " pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.870960 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b698d75f-d663-48cc-8881-8635d75bf8ac-utilities\") pod \"certified-operators-wx9d8\" (UID: \"b698d75f-d663-48cc-8881-8635d75bf8ac\") " pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.901174 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44525\" (UniqueName: \"kubernetes.io/projected/b698d75f-d663-48cc-8881-8635d75bf8ac-kube-api-access-44525\") pod \"certified-operators-wx9d8\" (UID: \"b698d75f-d663-48cc-8881-8635d75bf8ac\") " pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:01:58 crc kubenswrapper[4664]: I1003 08:01:58.998629 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:01:59 crc kubenswrapper[4664]: I1003 08:01:59.273033 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wx9d8"] Oct 03 08:01:59 crc kubenswrapper[4664]: I1003 08:01:59.756802 4664 generic.go:334] "Generic (PLEG): container finished" podID="b698d75f-d663-48cc-8881-8635d75bf8ac" containerID="a3e0f7b5e55164ef784eab022024e272d92d6632d72dbefd655cefcebde7d1b8" exitCode=0 Oct 03 08:01:59 crc kubenswrapper[4664]: I1003 08:01:59.757152 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx9d8" event={"ID":"b698d75f-d663-48cc-8881-8635d75bf8ac","Type":"ContainerDied","Data":"a3e0f7b5e55164ef784eab022024e272d92d6632d72dbefd655cefcebde7d1b8"} Oct 03 08:01:59 crc kubenswrapper[4664]: I1003 08:01:59.757186 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx9d8" event={"ID":"b698d75f-d663-48cc-8881-8635d75bf8ac","Type":"ContainerStarted","Data":"6dbadd33494f72c5b26f1daa111a66f37d173221aa8ae256f3041994c56e4da7"} Oct 03 08:01:59 crc kubenswrapper[4664]: I1003 08:01:59.881836 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a5a653-b1bd-420f-a23c-b764582b0810" path="/var/lib/kubelet/pods/55a5a653-b1bd-420f-a23c-b764582b0810/volumes" Oct 03 08:02:00 crc kubenswrapper[4664]: I1003 08:02:00.770035 4664 generic.go:334] "Generic (PLEG): container finished" podID="b698d75f-d663-48cc-8881-8635d75bf8ac" containerID="4158210d727c6303a4c5b3163be301a5fc2625c538525fc2cc0cfd52f683a6e3" exitCode=0 Oct 03 08:02:00 crc kubenswrapper[4664]: I1003 08:02:00.770181 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx9d8" event={"ID":"b698d75f-d663-48cc-8881-8635d75bf8ac","Type":"ContainerDied","Data":"4158210d727c6303a4c5b3163be301a5fc2625c538525fc2cc0cfd52f683a6e3"} Oct 03 08:02:01 crc kubenswrapper[4664]: I1003 08:02:01.594966 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x2777" Oct 03 08:02:01 crc kubenswrapper[4664]: I1003 08:02:01.777805 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx9d8" event={"ID":"b698d75f-d663-48cc-8881-8635d75bf8ac","Type":"ContainerStarted","Data":"bc7c69215c5841a34870f5394edaa81d451d79d4e101d9e1642b5ede1394d098"} Oct 03 08:02:01 crc kubenswrapper[4664]: I1003 08:02:01.796109 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wx9d8" podStartSLOduration=2.380681497 podStartE2EDuration="3.796076482s" podCreationTimestamp="2025-10-03 08:01:58 +0000 UTC" firstStartedPulling="2025-10-03 08:01:59.758504102 +0000 UTC m=+820.579694592" lastFinishedPulling="2025-10-03 08:02:01.173899087 +0000 UTC m=+821.995089577" observedRunningTime="2025-10-03 08:02:01.794715134 +0000 UTC m=+822.615905634" watchObservedRunningTime="2025-10-03 08:02:01.796076482 +0000 UTC m=+822.617266972" Oct 03 08:02:04 crc kubenswrapper[4664]: I1003 08:02:04.669303 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6fs97"] Oct 03 08:02:04 crc kubenswrapper[4664]: I1003 08:02:04.672354 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:04 crc kubenswrapper[4664]: I1003 08:02:04.680683 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6fs97"] Oct 03 08:02:04 crc kubenswrapper[4664]: I1003 08:02:04.847398 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d65f7349-61b0-4593-be43-b1bafe1ac639-catalog-content\") pod \"redhat-operators-6fs97\" (UID: \"d65f7349-61b0-4593-be43-b1bafe1ac639\") " pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:04 crc kubenswrapper[4664]: I1003 08:02:04.847739 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvhw4\" (UniqueName: \"kubernetes.io/projected/d65f7349-61b0-4593-be43-b1bafe1ac639-kube-api-access-tvhw4\") pod \"redhat-operators-6fs97\" (UID: \"d65f7349-61b0-4593-be43-b1bafe1ac639\") " pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:04 crc kubenswrapper[4664]: I1003 08:02:04.847866 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d65f7349-61b0-4593-be43-b1bafe1ac639-utilities\") pod \"redhat-operators-6fs97\" (UID: \"d65f7349-61b0-4593-be43-b1bafe1ac639\") " pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:04 crc kubenswrapper[4664]: I1003 08:02:04.949481 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d65f7349-61b0-4593-be43-b1bafe1ac639-catalog-content\") pod \"redhat-operators-6fs97\" (UID: \"d65f7349-61b0-4593-be43-b1bafe1ac639\") " pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:04 crc kubenswrapper[4664]: I1003 08:02:04.949974 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvhw4\" (UniqueName: \"kubernetes.io/projected/d65f7349-61b0-4593-be43-b1bafe1ac639-kube-api-access-tvhw4\") pod \"redhat-operators-6fs97\" (UID: \"d65f7349-61b0-4593-be43-b1bafe1ac639\") " pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:04 crc kubenswrapper[4664]: I1003 08:02:04.950110 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d65f7349-61b0-4593-be43-b1bafe1ac639-utilities\") pod \"redhat-operators-6fs97\" (UID: \"d65f7349-61b0-4593-be43-b1bafe1ac639\") " pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:04 crc kubenswrapper[4664]: I1003 08:02:04.950153 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d65f7349-61b0-4593-be43-b1bafe1ac639-catalog-content\") pod \"redhat-operators-6fs97\" (UID: \"d65f7349-61b0-4593-be43-b1bafe1ac639\") " pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:04 crc kubenswrapper[4664]: I1003 08:02:04.950522 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d65f7349-61b0-4593-be43-b1bafe1ac639-utilities\") pod \"redhat-operators-6fs97\" (UID: \"d65f7349-61b0-4593-be43-b1bafe1ac639\") " pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:04 crc kubenswrapper[4664]: I1003 08:02:04.970689 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tvhw4\" (UniqueName: \"kubernetes.io/projected/d65f7349-61b0-4593-be43-b1bafe1ac639-kube-api-access-tvhw4\") pod \"redhat-operators-6fs97\" (UID: \"d65f7349-61b0-4593-be43-b1bafe1ac639\") " pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:04 crc kubenswrapper[4664]: I1003 08:02:04.994411 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:05 crc kubenswrapper[4664]: I1003 08:02:05.421436 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6fs97"] Oct 03 08:02:05 crc kubenswrapper[4664]: I1003 08:02:05.799407 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6fs97" event={"ID":"d65f7349-61b0-4593-be43-b1bafe1ac639","Type":"ContainerStarted","Data":"d478f6b544a2773ec03f5ae8ad1dc08e5a0658e63133e7be62b015fd44c13723"} Oct 03 08:02:05 crc kubenswrapper[4664]: I1003 08:02:05.799483 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6fs97" event={"ID":"d65f7349-61b0-4593-be43-b1bafe1ac639","Type":"ContainerStarted","Data":"5a74038cc6ce630987af55cc32442559ef2f912709e2f99e1e5f9ddaabae58b4"} Oct 03 08:02:06 crc kubenswrapper[4664]: I1003 08:02:06.807244 4664 generic.go:334] "Generic (PLEG): container finished" podID="d65f7349-61b0-4593-be43-b1bafe1ac639" containerID="d478f6b544a2773ec03f5ae8ad1dc08e5a0658e63133e7be62b015fd44c13723" exitCode=0 Oct 03 08:02:06 crc kubenswrapper[4664]: I1003 08:02:06.807297 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6fs97" event={"ID":"d65f7349-61b0-4593-be43-b1bafe1ac639","Type":"ContainerDied","Data":"d478f6b544a2773ec03f5ae8ad1dc08e5a0658e63133e7be62b015fd44c13723"} Oct 03 08:02:07 crc kubenswrapper[4664]: I1003 08:02:07.815202 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6fs97" event={"ID":"d65f7349-61b0-4593-be43-b1bafe1ac639","Type":"ContainerStarted","Data":"4c85edb79b7bd28ec1123b79c1796aca1f2dab2191c837e6b052e796a6566e31"} Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.712187 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8"] Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.713702 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.715879 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.723952 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8"] Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.801693 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd00ff11-ad1a-4739-9a03-5c01723dea02-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8\" (UID: \"cd00ff11-ad1a-4739-9a03-5c01723dea02\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.801768 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgvws\" (UniqueName: \"kubernetes.io/projected/cd00ff11-ad1a-4739-9a03-5c01723dea02-kube-api-access-rgvws\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8\" (UID: \"cd00ff11-ad1a-4739-9a03-5c01723dea02\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.801794 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd00ff11-ad1a-4739-9a03-5c01723dea02-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8\" (UID: \"cd00ff11-ad1a-4739-9a03-5c01723dea02\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.821697 4664 generic.go:334] "Generic (PLEG): container finished" podID="d65f7349-61b0-4593-be43-b1bafe1ac639" containerID="4c85edb79b7bd28ec1123b79c1796aca1f2dab2191c837e6b052e796a6566e31" exitCode=0 Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.821757 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6fs97" event={"ID":"d65f7349-61b0-4593-be43-b1bafe1ac639","Type":"ContainerDied","Data":"4c85edb79b7bd28ec1123b79c1796aca1f2dab2191c837e6b052e796a6566e31"} Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.903472 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgvws\" (UniqueName: \"kubernetes.io/projected/cd00ff11-ad1a-4739-9a03-5c01723dea02-kube-api-access-rgvws\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8\" (UID: \"cd00ff11-ad1a-4739-9a03-5c01723dea02\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.903535 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd00ff11-ad1a-4739-9a03-5c01723dea02-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8\" (UID: \"cd00ff11-ad1a-4739-9a03-5c01723dea02\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.903597 4664 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd00ff11-ad1a-4739-9a03-5c01723dea02-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8\" (UID: \"cd00ff11-ad1a-4739-9a03-5c01723dea02\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.904137 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd00ff11-ad1a-4739-9a03-5c01723dea02-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8\" (UID: \"cd00ff11-ad1a-4739-9a03-5c01723dea02\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.904176 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd00ff11-ad1a-4739-9a03-5c01723dea02-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8\" (UID: \"cd00ff11-ad1a-4739-9a03-5c01723dea02\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.923702 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgvws\" (UniqueName: \"kubernetes.io/projected/cd00ff11-ad1a-4739-9a03-5c01723dea02-kube-api-access-rgvws\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8\" (UID: \"cd00ff11-ad1a-4739-9a03-5c01723dea02\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.998926 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:02:08 crc kubenswrapper[4664]: I1003 08:02:08.998985 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:02:09 crc kubenswrapper[4664]: I1003 08:02:09.037757 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:02:09 crc kubenswrapper[4664]: I1003 08:02:09.076760 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" Oct 03 08:02:09 crc kubenswrapper[4664]: I1003 08:02:09.264554 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8"] Oct 03 08:02:09 crc kubenswrapper[4664]: I1003 08:02:09.829255 4664 generic.go:334] "Generic (PLEG): container finished" podID="cd00ff11-ad1a-4739-9a03-5c01723dea02" containerID="8632b19ae50d09a1e2e2fb0d87a2769cee0e0cd9f240336b938745c2cf016813" exitCode=0 Oct 03 08:02:09 crc kubenswrapper[4664]: I1003 08:02:09.829337 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" event={"ID":"cd00ff11-ad1a-4739-9a03-5c01723dea02","Type":"ContainerDied","Data":"8632b19ae50d09a1e2e2fb0d87a2769cee0e0cd9f240336b938745c2cf016813"} Oct 03 08:02:09 crc kubenswrapper[4664]: I1003 08:02:09.829374 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" event={"ID":"cd00ff11-ad1a-4739-9a03-5c01723dea02","Type":"ContainerStarted","Data":"6b2c6e4afbc0b52e19c0c263a8dfe3cd39fc8879235639eefe10f10221f1c7c8"} Oct 03 08:02:09 crc kubenswrapper[4664]: I1003 08:02:09.840338 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6fs97" event={"ID":"d65f7349-61b0-4593-be43-b1bafe1ac639","Type":"ContainerStarted","Data":"d1ea5fb44bdddd63deaacbd7a118462c75c7b0d866bf2c31d5ebd3b95ec76b75"} Oct 03 08:02:09 crc kubenswrapper[4664]: I1003 08:02:09.875022 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6fs97" podStartSLOduration=3.137872999 podStartE2EDuration="5.87500169s" podCreationTimestamp="2025-10-03 08:02:04 +0000 UTC" firstStartedPulling="2025-10-03 08:02:06.809815733 +0000 UTC m=+827.631006223" lastFinishedPulling="2025-10-03 08:02:09.546944424 +0000 UTC m=+830.368134914" observedRunningTime="2025-10-03 08:02:09.867655479 +0000 UTC m=+830.688845979" watchObservedRunningTime="2025-10-03 08:02:09.87500169 +0000 UTC m=+830.696192200" Oct 03 08:02:09 crc kubenswrapper[4664]: I1003 08:02:09.883980 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:02:11 crc kubenswrapper[4664]: I1003 08:02:11.855727 4664 generic.go:334] "Generic (PLEG): container finished" podID="cd00ff11-ad1a-4739-9a03-5c01723dea02" containerID="63fbcb1d1aee400363dda8ccd1900d67cfd0e4e86ef596d2604377ab59902611" exitCode=0 Oct 03 08:02:11 crc kubenswrapper[4664]: I1003 08:02:11.855866 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" event={"ID":"cd00ff11-ad1a-4739-9a03-5c01723dea02","Type":"ContainerDied","Data":"63fbcb1d1aee400363dda8ccd1900d67cfd0e4e86ef596d2604377ab59902611"} Oct 03 08:02:11 crc kubenswrapper[4664]: I1003 08:02:11.987332 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:02:11 crc kubenswrapper[4664]: I1003 08:02:11.987397 4664 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:02:12 crc kubenswrapper[4664]: I1003 08:02:12.658175 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wx9d8"] Oct 03 08:02:12 crc kubenswrapper[4664]: I1003 08:02:12.658727 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wx9d8" podUID="b698d75f-d663-48cc-8881-8635d75bf8ac" containerName="registry-server" containerID="cri-o://bc7c69215c5841a34870f5394edaa81d451d79d4e101d9e1642b5ede1394d098" gracePeriod=2 Oct 03 08:02:12 crc kubenswrapper[4664]: I1003 08:02:12.870037 4664 generic.go:334] "Generic (PLEG): container finished" podID="cd00ff11-ad1a-4739-9a03-5c01723dea02" containerID="7589d034819b64ab59b146455969a464458fd3853d34475a97e63a1af9c9d6a5" exitCode=0 Oct 03 08:02:12 crc kubenswrapper[4664]: I1003 08:02:12.870097 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" event={"ID":"cd00ff11-ad1a-4739-9a03-5c01723dea02","Type":"ContainerDied","Data":"7589d034819b64ab59b146455969a464458fd3853d34475a97e63a1af9c9d6a5"} Oct 03 08:02:12 crc kubenswrapper[4664]: I1003 08:02:12.874001 4664 generic.go:334] "Generic (PLEG): container finished" podID="b698d75f-d663-48cc-8881-8635d75bf8ac" containerID="bc7c69215c5841a34870f5394edaa81d451d79d4e101d9e1642b5ede1394d098" exitCode=0 Oct 03 08:02:12 crc kubenswrapper[4664]: I1003 08:02:12.874052 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx9d8" event={"ID":"b698d75f-d663-48cc-8881-8635d75bf8ac","Type":"ContainerDied","Data":"bc7c69215c5841a34870f5394edaa81d451d79d4e101d9e1642b5ede1394d098"} Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.086551 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.264078 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b698d75f-d663-48cc-8881-8635d75bf8ac-utilities\") pod \"b698d75f-d663-48cc-8881-8635d75bf8ac\" (UID: \"b698d75f-d663-48cc-8881-8635d75bf8ac\") " Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.264133 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b698d75f-d663-48cc-8881-8635d75bf8ac-catalog-content\") pod \"b698d75f-d663-48cc-8881-8635d75bf8ac\" (UID: \"b698d75f-d663-48cc-8881-8635d75bf8ac\") " Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.264361 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44525\" (UniqueName: \"kubernetes.io/projected/b698d75f-d663-48cc-8881-8635d75bf8ac-kube-api-access-44525\") pod \"b698d75f-d663-48cc-8881-8635d75bf8ac\" (UID: \"b698d75f-d663-48cc-8881-8635d75bf8ac\") " Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.264989 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b698d75f-d663-48cc-8881-8635d75bf8ac-utilities" (OuterVolumeSpecName: "utilities") pod "b698d75f-d663-48cc-8881-8635d75bf8ac" (UID: "b698d75f-d663-48cc-8881-8635d75bf8ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.269368 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b698d75f-d663-48cc-8881-8635d75bf8ac-kube-api-access-44525" (OuterVolumeSpecName: "kube-api-access-44525") pod "b698d75f-d663-48cc-8881-8635d75bf8ac" (UID: "b698d75f-d663-48cc-8881-8635d75bf8ac"). InnerVolumeSpecName "kube-api-access-44525". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.309670 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b698d75f-d663-48cc-8881-8635d75bf8ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b698d75f-d663-48cc-8881-8635d75bf8ac" (UID: "b698d75f-d663-48cc-8881-8635d75bf8ac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.366251 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44525\" (UniqueName: \"kubernetes.io/projected/b698d75f-d663-48cc-8881-8635d75bf8ac-kube-api-access-44525\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.366346 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b698d75f-d663-48cc-8881-8635d75bf8ac-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.366356 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b698d75f-d663-48cc-8881-8635d75bf8ac-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.888571 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wx9d8" Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.896840 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx9d8" event={"ID":"b698d75f-d663-48cc-8881-8635d75bf8ac","Type":"ContainerDied","Data":"6dbadd33494f72c5b26f1daa111a66f37d173221aa8ae256f3041994c56e4da7"} Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.896925 4664 scope.go:117] "RemoveContainer" containerID="bc7c69215c5841a34870f5394edaa81d451d79d4e101d9e1642b5ede1394d098" Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.924505 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wx9d8"] Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.931504 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wx9d8"] Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.934711 4664 scope.go:117] "RemoveContainer" containerID="4158210d727c6303a4c5b3163be301a5fc2625c538525fc2cc0cfd52f683a6e3" Oct 03 08:02:13 crc kubenswrapper[4664]: I1003 08:02:13.951521 4664 scope.go:117] "RemoveContainer" containerID="a3e0f7b5e55164ef784eab022024e272d92d6632d72dbefd655cefcebde7d1b8" Oct 03 08:02:14 crc kubenswrapper[4664]: I1003 08:02:14.139465 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" Oct 03 08:02:14 crc kubenswrapper[4664]: I1003 08:02:14.280469 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd00ff11-ad1a-4739-9a03-5c01723dea02-util\") pod \"cd00ff11-ad1a-4739-9a03-5c01723dea02\" (UID: \"cd00ff11-ad1a-4739-9a03-5c01723dea02\") " Oct 03 08:02:14 crc kubenswrapper[4664]: I1003 08:02:14.280551 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd00ff11-ad1a-4739-9a03-5c01723dea02-bundle\") pod \"cd00ff11-ad1a-4739-9a03-5c01723dea02\" (UID: \"cd00ff11-ad1a-4739-9a03-5c01723dea02\") " Oct 03 08:02:14 crc kubenswrapper[4664]: I1003 08:02:14.280652 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgvws\" (UniqueName: \"kubernetes.io/projected/cd00ff11-ad1a-4739-9a03-5c01723dea02-kube-api-access-rgvws\") pod \"cd00ff11-ad1a-4739-9a03-5c01723dea02\" (UID: \"cd00ff11-ad1a-4739-9a03-5c01723dea02\") " Oct 03 08:02:14 crc kubenswrapper[4664]: I1003 08:02:14.281387 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd00ff11-ad1a-4739-9a03-5c01723dea02-bundle" (OuterVolumeSpecName: "bundle") pod "cd00ff11-ad1a-4739-9a03-5c01723dea02" (UID: "cd00ff11-ad1a-4739-9a03-5c01723dea02"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:02:14 crc kubenswrapper[4664]: I1003 08:02:14.285875 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd00ff11-ad1a-4739-9a03-5c01723dea02-kube-api-access-rgvws" (OuterVolumeSpecName: "kube-api-access-rgvws") pod "cd00ff11-ad1a-4739-9a03-5c01723dea02" (UID: "cd00ff11-ad1a-4739-9a03-5c01723dea02"). InnerVolumeSpecName "kube-api-access-rgvws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:02:14 crc kubenswrapper[4664]: I1003 08:02:14.294451 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd00ff11-ad1a-4739-9a03-5c01723dea02-util" (OuterVolumeSpecName: "util") pod "cd00ff11-ad1a-4739-9a03-5c01723dea02" (UID: "cd00ff11-ad1a-4739-9a03-5c01723dea02"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:02:14 crc kubenswrapper[4664]: I1003 08:02:14.382276 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgvws\" (UniqueName: \"kubernetes.io/projected/cd00ff11-ad1a-4739-9a03-5c01723dea02-kube-api-access-rgvws\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:14 crc kubenswrapper[4664]: I1003 08:02:14.382318 4664 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd00ff11-ad1a-4739-9a03-5c01723dea02-util\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:14 crc kubenswrapper[4664]: I1003 08:02:14.382329 4664 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd00ff11-ad1a-4739-9a03-5c01723dea02-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:14 crc kubenswrapper[4664]: I1003 08:02:14.895480 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" event={"ID":"cd00ff11-ad1a-4739-9a03-5c01723dea02","Type":"ContainerDied","Data":"6b2c6e4afbc0b52e19c0c263a8dfe3cd39fc8879235639eefe10f10221f1c7c8"} Oct 03 08:02:14 crc kubenswrapper[4664]: I1003 08:02:14.895522 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b2c6e4afbc0b52e19c0c263a8dfe3cd39fc8879235639eefe10f10221f1c7c8" Oct 03 08:02:14 crc kubenswrapper[4664]: I1003 08:02:14.895493 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8" Oct 03 08:02:14 crc kubenswrapper[4664]: I1003 08:02:14.995312 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:14 crc kubenswrapper[4664]: I1003 08:02:14.995361 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:15 crc kubenswrapper[4664]: I1003 08:02:15.036509 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:15 crc kubenswrapper[4664]: I1003 08:02:15.882643 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b698d75f-d663-48cc-8881-8635d75bf8ac" path="/var/lib/kubelet/pods/b698d75f-d663-48cc-8881-8635d75bf8ac/volumes" Oct 03 08:02:15 crc kubenswrapper[4664]: I1003 08:02:15.952116 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.003516 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-fj4m9"] Oct 03 08:02:17 crc kubenswrapper[4664]: E1003 08:02:17.004446 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd00ff11-ad1a-4739-9a03-5c01723dea02" containerName="extract" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.004523 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd00ff11-ad1a-4739-9a03-5c01723dea02" containerName="extract" Oct 03 08:02:17 crc kubenswrapper[4664]: E1003 08:02:17.004635 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b698d75f-d663-48cc-8881-8635d75bf8ac" containerName="extract-content" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.004696 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b698d75f-d663-48cc-8881-8635d75bf8ac" containerName="extract-content" Oct 03 08:02:17 crc kubenswrapper[4664]: E1003 08:02:17.004751 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b698d75f-d663-48cc-8881-8635d75bf8ac" containerName="extract-utilities" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.004838 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b698d75f-d663-48cc-8881-8635d75bf8ac" containerName="extract-utilities" Oct 03 08:02:17 crc kubenswrapper[4664]: E1003 08:02:17.004898 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b698d75f-d663-48cc-8881-8635d75bf8ac" containerName="registry-server" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.004944 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b698d75f-d663-48cc-8881-8635d75bf8ac" containerName="registry-server" Oct 03 08:02:17 crc kubenswrapper[4664]: E1003 08:02:17.004994 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd00ff11-ad1a-4739-9a03-5c01723dea02" containerName="util" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.005055 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd00ff11-ad1a-4739-9a03-5c01723dea02" containerName="util" Oct 03 08:02:17 crc kubenswrapper[4664]: E1003 08:02:17.005147 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd00ff11-ad1a-4739-9a03-5c01723dea02" containerName="pull" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.005213 4664 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cd00ff11-ad1a-4739-9a03-5c01723dea02" containerName="pull" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.005360 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd00ff11-ad1a-4739-9a03-5c01723dea02" containerName="extract" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.005422 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="b698d75f-d663-48cc-8881-8635d75bf8ac" containerName="registry-server" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.005983 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-fj4m9" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.007958 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-2ncph" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.008541 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.008912 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.031765 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-fj4m9"] Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.120959 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl5d9\" (UniqueName: \"kubernetes.io/projected/ef13533a-c837-42d9-87f7-2025250fd36f-kube-api-access-pl5d9\") pod \"nmstate-operator-858ddd8f98-fj4m9\" (UID: \"ef13533a-c837-42d9-87f7-2025250fd36f\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-fj4m9" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.221837 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl5d9\" (UniqueName: \"kubernetes.io/projected/ef13533a-c837-42d9-87f7-2025250fd36f-kube-api-access-pl5d9\") pod \"nmstate-operator-858ddd8f98-fj4m9\" (UID: \"ef13533a-c837-42d9-87f7-2025250fd36f\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-fj4m9" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.242528 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl5d9\" (UniqueName: \"kubernetes.io/projected/ef13533a-c837-42d9-87f7-2025250fd36f-kube-api-access-pl5d9\") pod \"nmstate-operator-858ddd8f98-fj4m9\" (UID: \"ef13533a-c837-42d9-87f7-2025250fd36f\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-fj4m9" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.257455 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6fs97"] Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.320304 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-fj4m9" Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.749975 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-fj4m9"] Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.914252 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-fj4m9" event={"ID":"ef13533a-c837-42d9-87f7-2025250fd36f","Type":"ContainerStarted","Data":"68e9c1742a9ed6c0753bcd4777654711d88f59ef879ce4e93bd82255edf640f9"} Oct 03 08:02:17 crc kubenswrapper[4664]: I1003 08:02:17.914993 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6fs97" podUID="d65f7349-61b0-4593-be43-b1bafe1ac639" containerName="registry-server" containerID="cri-o://d1ea5fb44bdddd63deaacbd7a118462c75c7b0d866bf2c31d5ebd3b95ec76b75" gracePeriod=2 Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.247872 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.445435 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvhw4\" (UniqueName: \"kubernetes.io/projected/d65f7349-61b0-4593-be43-b1bafe1ac639-kube-api-access-tvhw4\") pod \"d65f7349-61b0-4593-be43-b1bafe1ac639\" (UID: \"d65f7349-61b0-4593-be43-b1bafe1ac639\") " Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.445809 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d65f7349-61b0-4593-be43-b1bafe1ac639-utilities\") pod \"d65f7349-61b0-4593-be43-b1bafe1ac639\" (UID: \"d65f7349-61b0-4593-be43-b1bafe1ac639\") " Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.445868 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d65f7349-61b0-4593-be43-b1bafe1ac639-catalog-content\") pod \"d65f7349-61b0-4593-be43-b1bafe1ac639\" (UID: \"d65f7349-61b0-4593-be43-b1bafe1ac639\") " Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.446554 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d65f7349-61b0-4593-be43-b1bafe1ac639-utilities" (OuterVolumeSpecName: "utilities") pod "d65f7349-61b0-4593-be43-b1bafe1ac639" (UID: "d65f7349-61b0-4593-be43-b1bafe1ac639"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.450236 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d65f7349-61b0-4593-be43-b1bafe1ac639-kube-api-access-tvhw4" (OuterVolumeSpecName: "kube-api-access-tvhw4") pod "d65f7349-61b0-4593-be43-b1bafe1ac639" (UID: "d65f7349-61b0-4593-be43-b1bafe1ac639"). InnerVolumeSpecName "kube-api-access-tvhw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.525679 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d65f7349-61b0-4593-be43-b1bafe1ac639-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d65f7349-61b0-4593-be43-b1bafe1ac639" (UID: "d65f7349-61b0-4593-be43-b1bafe1ac639"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.547656 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d65f7349-61b0-4593-be43-b1bafe1ac639-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.547724 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvhw4\" (UniqueName: \"kubernetes.io/projected/d65f7349-61b0-4593-be43-b1bafe1ac639-kube-api-access-tvhw4\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.547740 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d65f7349-61b0-4593-be43-b1bafe1ac639-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.924112 4664 generic.go:334] "Generic (PLEG): container finished" podID="d65f7349-61b0-4593-be43-b1bafe1ac639" containerID="d1ea5fb44bdddd63deaacbd7a118462c75c7b0d866bf2c31d5ebd3b95ec76b75" exitCode=0 Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.924165 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6fs97" event={"ID":"d65f7349-61b0-4593-be43-b1bafe1ac639","Type":"ContainerDied","Data":"d1ea5fb44bdddd63deaacbd7a118462c75c7b0d866bf2c31d5ebd3b95ec76b75"} Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.924183 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6fs97" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.924198 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6fs97" event={"ID":"d65f7349-61b0-4593-be43-b1bafe1ac639","Type":"ContainerDied","Data":"5a74038cc6ce630987af55cc32442559ef2f912709e2f99e1e5f9ddaabae58b4"} Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.924222 4664 scope.go:117] "RemoveContainer" containerID="d1ea5fb44bdddd63deaacbd7a118462c75c7b0d866bf2c31d5ebd3b95ec76b75" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.945314 4664 scope.go:117] "RemoveContainer" containerID="4c85edb79b7bd28ec1123b79c1796aca1f2dab2191c837e6b052e796a6566e31" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.961201 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6fs97"] Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.964781 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6fs97"] Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.970089 4664 scope.go:117] "RemoveContainer" containerID="d478f6b544a2773ec03f5ae8ad1dc08e5a0658e63133e7be62b015fd44c13723" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.989241 4664 scope.go:117] "RemoveContainer" containerID="d1ea5fb44bdddd63deaacbd7a118462c75c7b0d866bf2c31d5ebd3b95ec76b75" Oct 03 08:02:18 crc kubenswrapper[4664]: E1003 08:02:18.990439 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1ea5fb44bdddd63deaacbd7a118462c75c7b0d866bf2c31d5ebd3b95ec76b75\": container with ID starting with d1ea5fb44bdddd63deaacbd7a118462c75c7b0d866bf2c31d5ebd3b95ec76b75 not found: ID does not exist" containerID="d1ea5fb44bdddd63deaacbd7a118462c75c7b0d866bf2c31d5ebd3b95ec76b75" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.990496 4664 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ea5fb44bdddd63deaacbd7a118462c75c7b0d866bf2c31d5ebd3b95ec76b75"} err="failed to get container status \"d1ea5fb44bdddd63deaacbd7a118462c75c7b0d866bf2c31d5ebd3b95ec76b75\": rpc error: code = NotFound desc = could not find container \"d1ea5fb44bdddd63deaacbd7a118462c75c7b0d866bf2c31d5ebd3b95ec76b75\": container with ID starting with d1ea5fb44bdddd63deaacbd7a118462c75c7b0d866bf2c31d5ebd3b95ec76b75 not found: ID does not exist" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.990536 4664 scope.go:117] "RemoveContainer" containerID="4c85edb79b7bd28ec1123b79c1796aca1f2dab2191c837e6b052e796a6566e31" Oct 03 08:02:18 crc kubenswrapper[4664]: E1003 08:02:18.991005 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c85edb79b7bd28ec1123b79c1796aca1f2dab2191c837e6b052e796a6566e31\": container with ID starting with 4c85edb79b7bd28ec1123b79c1796aca1f2dab2191c837e6b052e796a6566e31 not found: ID does not exist" containerID="4c85edb79b7bd28ec1123b79c1796aca1f2dab2191c837e6b052e796a6566e31" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.991139 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c85edb79b7bd28ec1123b79c1796aca1f2dab2191c837e6b052e796a6566e31"} err="failed to get container status \"4c85edb79b7bd28ec1123b79c1796aca1f2dab2191c837e6b052e796a6566e31\": rpc error: code = NotFound desc = could not find container \"4c85edb79b7bd28ec1123b79c1796aca1f2dab2191c837e6b052e796a6566e31\": container with ID starting with 4c85edb79b7bd28ec1123b79c1796aca1f2dab2191c837e6b052e796a6566e31 not found: ID does not exist" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.991271 4664 scope.go:117] "RemoveContainer" containerID="d478f6b544a2773ec03f5ae8ad1dc08e5a0658e63133e7be62b015fd44c13723" Oct 03 08:02:18 crc kubenswrapper[4664]: E1003 08:02:18.992168 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d478f6b544a2773ec03f5ae8ad1dc08e5a0658e63133e7be62b015fd44c13723\": container with ID starting with d478f6b544a2773ec03f5ae8ad1dc08e5a0658e63133e7be62b015fd44c13723 not found: ID does not exist" containerID="d478f6b544a2773ec03f5ae8ad1dc08e5a0658e63133e7be62b015fd44c13723" Oct 03 08:02:18 crc kubenswrapper[4664]: I1003 08:02:18.992198 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d478f6b544a2773ec03f5ae8ad1dc08e5a0658e63133e7be62b015fd44c13723"} err="failed to get container status \"d478f6b544a2773ec03f5ae8ad1dc08e5a0658e63133e7be62b015fd44c13723\": rpc error: code = NotFound desc = could not find container \"d478f6b544a2773ec03f5ae8ad1dc08e5a0658e63133e7be62b015fd44c13723\": container with ID starting with d478f6b544a2773ec03f5ae8ad1dc08e5a0658e63133e7be62b015fd44c13723 not found: ID does not exist" Oct 03 08:02:19 crc kubenswrapper[4664]: I1003 08:02:19.884637 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d65f7349-61b0-4593-be43-b1bafe1ac639" path="/var/lib/kubelet/pods/d65f7349-61b0-4593-be43-b1bafe1ac639/volumes" Oct 03 08:02:19 crc kubenswrapper[4664]: I1003 08:02:19.932301 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-fj4m9" 
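[Editor's note] The "DeleteContainer returned error ... NotFound" lines above are benign: the kubelet retries garbage collection for containers that CRI-O has already removed, and a NotFound status simply means the work is done. A minimal stdlib Go sketch of that idempotent-delete pattern (the sentinel error and store are illustrative, not kubelet code):

```go
package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for the runtime's gRPC NotFound status (hypothetical sentinel).
var errNotFound = errors.New("container not found")

// store is a toy stand-in for the CRI runtime's container table.
var store = map[string]bool{}

func containerStatus(id string) error {
	if !store[id] {
		return fmt.Errorf("rpc error: could not find container %q: %w", id, errNotFound)
	}
	return nil
}

// removeContainer is idempotent: a NotFound from the runtime is treated as
// "already removed", which is why the log entries above are warnings, not failures.
func removeContainer(id string) error {
	if err := containerStatus(id); err != nil {
		if errors.Is(err, errNotFound) {
			return nil // already gone; nothing to do
		}
		return err
	}
	delete(store, id)
	return nil
}

func main() {
	fmt.Println(removeContainer("d1ea5fb44bdd")) // prints <nil>: NotFound tolerated
}
```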
event={"ID":"ef13533a-c837-42d9-87f7-2025250fd36f","Type":"ContainerStarted","Data":"ef6e6fa0e927ed362db4c28d03f58e3f2aab62ef0b8e686987c0ca17ad501017"} Oct 03 08:02:19 crc kubenswrapper[4664]: I1003 08:02:19.949975 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-fj4m9" podStartSLOduration=2.111157674 podStartE2EDuration="3.949955731s" podCreationTimestamp="2025-10-03 08:02:16 +0000 UTC" firstStartedPulling="2025-10-03 08:02:17.758380732 +0000 UTC m=+838.579571232" lastFinishedPulling="2025-10-03 08:02:19.597178799 +0000 UTC m=+840.418369289" observedRunningTime="2025-10-03 08:02:19.948345267 +0000 UTC m=+840.769535767" watchObservedRunningTime="2025-10-03 08:02:19.949955731 +0000 UTC m=+840.771146221" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.115274 4664 scope.go:117] "RemoveContainer" containerID="482e54714945acaea85fdeeb4b89eb9b16568c96319d07eb812ef88bd5faeb85" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.808491 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-5cmx6"] Oct 03 08:02:20 crc kubenswrapper[4664]: E1003 08:02:20.809160 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d65f7349-61b0-4593-be43-b1bafe1ac639" containerName="extract-content" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.809181 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="d65f7349-61b0-4593-be43-b1bafe1ac639" containerName="extract-content" Oct 03 08:02:20 crc kubenswrapper[4664]: E1003 08:02:20.809198 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d65f7349-61b0-4593-be43-b1bafe1ac639" containerName="registry-server" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.809206 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="d65f7349-61b0-4593-be43-b1bafe1ac639" containerName="registry-server" Oct 03 08:02:20 crc kubenswrapper[4664]: E1003 08:02:20.809237 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d65f7349-61b0-4593-be43-b1bafe1ac639" containerName="extract-utilities" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.809245 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="d65f7349-61b0-4593-be43-b1bafe1ac639" containerName="extract-utilities" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.809354 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="d65f7349-61b0-4593-be43-b1bafe1ac639" containerName="registry-server" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.810081 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-5cmx6" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.812981 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-6zflz" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.813782 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m"] Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.816186 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.817682 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.830785 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m"] Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.835037 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-5cmx6"] Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.844416 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-flh2z"] Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.845129 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-flh2z" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.884137 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/cff70c7f-0a7c-4288-ae5a-74a88043586b-ovs-socket\") pod \"nmstate-handler-flh2z\" (UID: \"cff70c7f-0a7c-4288-ae5a-74a88043586b\") " pod="openshift-nmstate/nmstate-handler-flh2z" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.884234 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/cff70c7f-0a7c-4288-ae5a-74a88043586b-nmstate-lock\") pod \"nmstate-handler-flh2z\" (UID: \"cff70c7f-0a7c-4288-ae5a-74a88043586b\") " pod="openshift-nmstate/nmstate-handler-flh2z" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.884283 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgg74\" (UniqueName: \"kubernetes.io/projected/cff70c7f-0a7c-4288-ae5a-74a88043586b-kube-api-access-xgg74\") pod \"nmstate-handler-flh2z\" (UID: \"cff70c7f-0a7c-4288-ae5a-74a88043586b\") " pod="openshift-nmstate/nmstate-handler-flh2z" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.884342 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c3c37a07-83fe-42c4-89b3-ab591db694aa-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-jct4m\" (UID: \"c3c37a07-83fe-42c4-89b3-ab591db694aa\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.884367 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/cff70c7f-0a7c-4288-ae5a-74a88043586b-dbus-socket\") pod \"nmstate-handler-flh2z\" (UID: \"cff70c7f-0a7c-4288-ae5a-74a88043586b\") " pod="openshift-nmstate/nmstate-handler-flh2z" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.884386 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl4h2\" (UniqueName: \"kubernetes.io/projected/53e89226-668e-4f4e-9035-293df2a37944-kube-api-access-bl4h2\") pod \"nmstate-metrics-fdff9cb8d-5cmx6\" (UID: \"53e89226-668e-4f4e-9035-293df2a37944\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-5cmx6" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.884409 4664 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvq2d\" (UniqueName: \"kubernetes.io/projected/c3c37a07-83fe-42c4-89b3-ab591db694aa-kube-api-access-qvq2d\") pod \"nmstate-webhook-6cdbc54649-jct4m\" (UID: \"c3c37a07-83fe-42c4-89b3-ab591db694aa\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.941513 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-72cm2_6998d742-8d17-4f20-ab52-c30d9f7b0b89/kube-multus/2.log" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.957116 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8"] Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.957855 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.959515 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.960871 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.961065 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-ncqnh" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.977837 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8"] Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.985181 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c3c37a07-83fe-42c4-89b3-ab591db694aa-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-jct4m\" (UID: \"c3c37a07-83fe-42c4-89b3-ab591db694aa\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.985229 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/cff70c7f-0a7c-4288-ae5a-74a88043586b-dbus-socket\") pod \"nmstate-handler-flh2z\" (UID: \"cff70c7f-0a7c-4288-ae5a-74a88043586b\") " pod="openshift-nmstate/nmstate-handler-flh2z" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.985259 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl4h2\" (UniqueName: \"kubernetes.io/projected/53e89226-668e-4f4e-9035-293df2a37944-kube-api-access-bl4h2\") pod \"nmstate-metrics-fdff9cb8d-5cmx6\" (UID: \"53e89226-668e-4f4e-9035-293df2a37944\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-5cmx6" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.985285 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvq2d\" (UniqueName: \"kubernetes.io/projected/c3c37a07-83fe-42c4-89b3-ab591db694aa-kube-api-access-qvq2d\") pod \"nmstate-webhook-6cdbc54649-jct4m\" (UID: \"c3c37a07-83fe-42c4-89b3-ab591db694aa\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.985335 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9hp7\" (UniqueName: \"kubernetes.io/projected/6888a579-64c6-4313-bb63-5d8e09d9389c-kube-api-access-v9hp7\") pod 
\"nmstate-console-plugin-6b874cbd85-l6sj8\" (UID: \"6888a579-64c6-4313-bb63-5d8e09d9389c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.985383 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/cff70c7f-0a7c-4288-ae5a-74a88043586b-ovs-socket\") pod \"nmstate-handler-flh2z\" (UID: \"cff70c7f-0a7c-4288-ae5a-74a88043586b\") " pod="openshift-nmstate/nmstate-handler-flh2z" Oct 03 08:02:20 crc kubenswrapper[4664]: E1003 08:02:20.985376 4664 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.985431 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/cff70c7f-0a7c-4288-ae5a-74a88043586b-nmstate-lock\") pod \"nmstate-handler-flh2z\" (UID: \"cff70c7f-0a7c-4288-ae5a-74a88043586b\") " pod="openshift-nmstate/nmstate-handler-flh2z" Oct 03 08:02:20 crc kubenswrapper[4664]: E1003 08:02:20.985454 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3c37a07-83fe-42c4-89b3-ab591db694aa-tls-key-pair podName:c3c37a07-83fe-42c4-89b3-ab591db694aa nodeName:}" failed. No retries permitted until 2025-10-03 08:02:21.485432432 +0000 UTC m=+842.306622922 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/c3c37a07-83fe-42c4-89b3-ab591db694aa-tls-key-pair") pod "nmstate-webhook-6cdbc54649-jct4m" (UID: "c3c37a07-83fe-42c4-89b3-ab591db694aa") : secret "openshift-nmstate-webhook" not found Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.985477 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/cff70c7f-0a7c-4288-ae5a-74a88043586b-nmstate-lock\") pod \"nmstate-handler-flh2z\" (UID: \"cff70c7f-0a7c-4288-ae5a-74a88043586b\") " pod="openshift-nmstate/nmstate-handler-flh2z" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.985478 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6888a579-64c6-4313-bb63-5d8e09d9389c-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-l6sj8\" (UID: \"6888a579-64c6-4313-bb63-5d8e09d9389c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.985492 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/cff70c7f-0a7c-4288-ae5a-74a88043586b-ovs-socket\") pod \"nmstate-handler-flh2z\" (UID: \"cff70c7f-0a7c-4288-ae5a-74a88043586b\") " pod="openshift-nmstate/nmstate-handler-flh2z" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.985525 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgg74\" (UniqueName: \"kubernetes.io/projected/cff70c7f-0a7c-4288-ae5a-74a88043586b-kube-api-access-xgg74\") pod \"nmstate-handler-flh2z\" (UID: \"cff70c7f-0a7c-4288-ae5a-74a88043586b\") " pod="openshift-nmstate/nmstate-handler-flh2z" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.985551 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/6888a579-64c6-4313-bb63-5d8e09d9389c-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-l6sj8\" (UID: \"6888a579-64c6-4313-bb63-5d8e09d9389c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8" Oct 03 08:02:20 crc kubenswrapper[4664]: I1003 08:02:20.985704 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/cff70c7f-0a7c-4288-ae5a-74a88043586b-dbus-socket\") pod \"nmstate-handler-flh2z\" (UID: \"cff70c7f-0a7c-4288-ae5a-74a88043586b\") " pod="openshift-nmstate/nmstate-handler-flh2z" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.028596 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvq2d\" (UniqueName: \"kubernetes.io/projected/c3c37a07-83fe-42c4-89b3-ab591db694aa-kube-api-access-qvq2d\") pod \"nmstate-webhook-6cdbc54649-jct4m\" (UID: \"c3c37a07-83fe-42c4-89b3-ab591db694aa\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.033166 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgg74\" (UniqueName: \"kubernetes.io/projected/cff70c7f-0a7c-4288-ae5a-74a88043586b-kube-api-access-xgg74\") pod \"nmstate-handler-flh2z\" (UID: \"cff70c7f-0a7c-4288-ae5a-74a88043586b\") " pod="openshift-nmstate/nmstate-handler-flh2z" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.043334 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl4h2\" (UniqueName: \"kubernetes.io/projected/53e89226-668e-4f4e-9035-293df2a37944-kube-api-access-bl4h2\") pod \"nmstate-metrics-fdff9cb8d-5cmx6\" (UID: \"53e89226-668e-4f4e-9035-293df2a37944\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-5cmx6" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.086493 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6888a579-64c6-4313-bb63-5d8e09d9389c-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-l6sj8\" (UID: \"6888a579-64c6-4313-bb63-5d8e09d9389c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.086546 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6888a579-64c6-4313-bb63-5d8e09d9389c-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-l6sj8\" (UID: \"6888a579-64c6-4313-bb63-5d8e09d9389c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.086659 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9hp7\" (UniqueName: \"kubernetes.io/projected/6888a579-64c6-4313-bb63-5d8e09d9389c-kube-api-access-v9hp7\") pod \"nmstate-console-plugin-6b874cbd85-l6sj8\" (UID: \"6888a579-64c6-4313-bb63-5d8e09d9389c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8" Oct 03 08:02:21 crc kubenswrapper[4664]: E1003 08:02:21.087035 4664 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 03 08:02:21 crc kubenswrapper[4664]: E1003 08:02:21.087092 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6888a579-64c6-4313-bb63-5d8e09d9389c-plugin-serving-cert podName:6888a579-64c6-4313-bb63-5d8e09d9389c nodeName:}" failed. 
No retries permitted until 2025-10-03 08:02:21.58707388 +0000 UTC m=+842.408264370 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/6888a579-64c6-4313-bb63-5d8e09d9389c-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-l6sj8" (UID: "6888a579-64c6-4313-bb63-5d8e09d9389c") : secret "plugin-serving-cert" not found Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.088001 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6888a579-64c6-4313-bb63-5d8e09d9389c-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-l6sj8\" (UID: \"6888a579-64c6-4313-bb63-5d8e09d9389c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.109857 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9hp7\" (UniqueName: \"kubernetes.io/projected/6888a579-64c6-4313-bb63-5d8e09d9389c-kube-api-access-v9hp7\") pod \"nmstate-console-plugin-6b874cbd85-l6sj8\" (UID: \"6888a579-64c6-4313-bb63-5d8e09d9389c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.128894 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-5cmx6" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.162160 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-flh2z" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.192252 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6ccf4b568b-f9rcf"] Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.193397 4664 util.go:30] "No sandbox for pod can be found. 
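[Editor's note] The two "No retries permitted until ... (durationBeforeRetry 500ms)" errors above are the kubelet's mount backoff: the serving-cert secrets do not exist yet when the pods are first synced, so SetUp fails, waits out the backoff, and succeeds once the operator publishes the secret (see the successful tls-key-pair and plugin-serving-cert mounts at 08:02:21.498 and 08:02:21.598 below). A minimal stdlib Go sketch of that retry-with-backoff behaviour; the 500ms initial delay matches the log, while the 2-minute cap is an assumption for illustration:

```go
package main

import (
	"errors"
	"fmt"
	"sync/atomic"
	"time"
)

// mountWithBackoff retries SetUp with exponentially growing delays,
// mirroring the "No retries permitted until ..." entries above.
func mountWithBackoff(setUp func() error) error {
	delay := 500 * time.Millisecond // durationBeforeRetry in the log
	for attempt := 1; ; attempt++ {
		err := setUp()
		if err == nil {
			return nil
		}
		if delay > 2*time.Minute { // illustrative cap, not a kubelet constant
			return fmt.Errorf("giving up: %w", err)
		}
		fmt.Printf("attempt %d failed (%v); no retries permitted for %v\n", attempt, err, delay)
		time.Sleep(delay)
		delay *= 2
	}
}

func main() {
	var ready atomic.Bool
	// Simulate the operator creating the secret ~600ms after the first mount attempt.
	go func() { time.Sleep(600 * time.Millisecond); ready.Store(true) }()
	_ = mountWithBackoff(func() error {
		if !ready.Load() {
			return errors.New(`secret "plugin-serving-cert" not found`)
		}
		return nil
	})
	fmt.Println("MountVolume.SetUp succeeded")
}
```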
Need to start a new one" pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: W1003 08:02:21.198880 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcff70c7f_0a7c_4288_ae5a_74a88043586b.slice/crio-d527e5d22cecb48033e744a8765e7383e875cf0d516c0b1745af8b9faf857e97 WatchSource:0}: Error finding container d527e5d22cecb48033e744a8765e7383e875cf0d516c0b1745af8b9faf857e97: Status 404 returned error can't find the container with id d527e5d22cecb48033e744a8765e7383e875cf0d516c0b1745af8b9faf857e97 Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.214431 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ccf4b568b-f9rcf"] Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.391916 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e40d30f-350b-4cf6-a323-5fb6e82bf733-service-ca\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.391963 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqj2s\" (UniqueName: \"kubernetes.io/projected/0e40d30f-350b-4cf6-a323-5fb6e82bf733-kube-api-access-jqj2s\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.391995 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e40d30f-350b-4cf6-a323-5fb6e82bf733-console-config\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.392027 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e40d30f-350b-4cf6-a323-5fb6e82bf733-console-serving-cert\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.392095 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e40d30f-350b-4cf6-a323-5fb6e82bf733-oauth-serving-cert\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.392144 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e40d30f-350b-4cf6-a323-5fb6e82bf733-console-oauth-config\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.392194 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e40d30f-350b-4cf6-a323-5fb6e82bf733-trusted-ca-bundle\") pod 
\"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.493069 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e40d30f-350b-4cf6-a323-5fb6e82bf733-trusted-ca-bundle\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.493137 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e40d30f-350b-4cf6-a323-5fb6e82bf733-service-ca\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.493172 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqj2s\" (UniqueName: \"kubernetes.io/projected/0e40d30f-350b-4cf6-a323-5fb6e82bf733-kube-api-access-jqj2s\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.493211 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e40d30f-350b-4cf6-a323-5fb6e82bf733-console-config\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.493243 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c3c37a07-83fe-42c4-89b3-ab591db694aa-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-jct4m\" (UID: \"c3c37a07-83fe-42c4-89b3-ab591db694aa\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.493275 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e40d30f-350b-4cf6-a323-5fb6e82bf733-console-serving-cert\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.493307 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e40d30f-350b-4cf6-a323-5fb6e82bf733-oauth-serving-cert\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.493341 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e40d30f-350b-4cf6-a323-5fb6e82bf733-console-oauth-config\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.494338 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e40d30f-350b-4cf6-a323-5fb6e82bf733-service-ca\") pod 
\"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.494353 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e40d30f-350b-4cf6-a323-5fb6e82bf733-oauth-serving-cert\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.494976 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e40d30f-350b-4cf6-a323-5fb6e82bf733-trusted-ca-bundle\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.495291 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e40d30f-350b-4cf6-a323-5fb6e82bf733-console-config\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.498415 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c3c37a07-83fe-42c4-89b3-ab591db694aa-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-jct4m\" (UID: \"c3c37a07-83fe-42c4-89b3-ab591db694aa\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.498467 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e40d30f-350b-4cf6-a323-5fb6e82bf733-console-oauth-config\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.500986 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e40d30f-350b-4cf6-a323-5fb6e82bf733-console-serving-cert\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.509430 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqj2s\" (UniqueName: \"kubernetes.io/projected/0e40d30f-350b-4cf6-a323-5fb6e82bf733-kube-api-access-jqj2s\") pod \"console-6ccf4b568b-f9rcf\" (UID: \"0e40d30f-350b-4cf6-a323-5fb6e82bf733\") " pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.515645 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6ccf4b568b-f9rcf" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.594567 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6888a579-64c6-4313-bb63-5d8e09d9389c-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-l6sj8\" (UID: \"6888a579-64c6-4313-bb63-5d8e09d9389c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.598304 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6888a579-64c6-4313-bb63-5d8e09d9389c-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-l6sj8\" (UID: \"6888a579-64c6-4313-bb63-5d8e09d9389c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.607873 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-5cmx6"] Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.741117 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.870321 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8" Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.906741 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ccf4b568b-f9rcf"] Oct 03 08:02:21 crc kubenswrapper[4664]: W1003 08:02:21.928037 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e40d30f_350b_4cf6_a323_5fb6e82bf733.slice/crio-8d68413da015806aab0dca65c4d6ffbeda854bdc4ceb3e72f2b170a1773c3670 WatchSource:0}: Error finding container 8d68413da015806aab0dca65c4d6ffbeda854bdc4ceb3e72f2b170a1773c3670: Status 404 returned error can't find the container with id 8d68413da015806aab0dca65c4d6ffbeda854bdc4ceb3e72f2b170a1773c3670 Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.963979 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-5cmx6" event={"ID":"53e89226-668e-4f4e-9035-293df2a37944","Type":"ContainerStarted","Data":"c53741ff6c2539d4996f5517ffc014638a2bf6726b00c2bddfc8bc3d0a0d8daa"} Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.966292 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ccf4b568b-f9rcf" event={"ID":"0e40d30f-350b-4cf6-a323-5fb6e82bf733","Type":"ContainerStarted","Data":"8d68413da015806aab0dca65c4d6ffbeda854bdc4ceb3e72f2b170a1773c3670"} Oct 03 08:02:21 crc kubenswrapper[4664]: I1003 08:02:21.968303 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-flh2z" event={"ID":"cff70c7f-0a7c-4288-ae5a-74a88043586b","Type":"ContainerStarted","Data":"d527e5d22cecb48033e744a8765e7383e875cf0d516c0b1745af8b9faf857e97"} Oct 03 08:02:22 crc kubenswrapper[4664]: I1003 08:02:22.099693 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8"] Oct 03 08:02:22 crc kubenswrapper[4664]: I1003 08:02:22.173873 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m"] Oct 03 08:02:22 crc 
Oct 03 08:02:22 crc kubenswrapper[4664]: I1003 08:02:22.974307 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8" event={"ID":"6888a579-64c6-4313-bb63-5d8e09d9389c","Type":"ContainerStarted","Data":"b12e3bf183f40dcd842e655eace13b0ccdc366ddcb819fcd9f64fe907f86cfff"}
Oct 03 08:02:22 crc kubenswrapper[4664]: I1003 08:02:22.978447 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ccf4b568b-f9rcf" event={"ID":"0e40d30f-350b-4cf6-a323-5fb6e82bf733","Type":"ContainerStarted","Data":"80afd3bfb315fba02186f0f5fbad5cd3fe9ed04d088f132db5b373f1a558baaa"}
Oct 03 08:02:22 crc kubenswrapper[4664]: I1003 08:02:22.979377 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m" event={"ID":"c3c37a07-83fe-42c4-89b3-ab591db694aa","Type":"ContainerStarted","Data":"b9048cf711a2bc142493f81e4bd250165ce74deb2dd22860f0aa2c55283a056b"}
Oct 03 08:02:23 crc kubenswrapper[4664]: I1003 08:02:23.000492 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6ccf4b568b-f9rcf" podStartSLOduration=2.000475128 podStartE2EDuration="2.000475128s" podCreationTimestamp="2025-10-03 08:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:02:22.998270877 +0000 UTC m=+843.819461377" watchObservedRunningTime="2025-10-03 08:02:23.000475128 +0000 UTC m=+843.821665608"
Oct 03 08:02:23 crc kubenswrapper[4664]: I1003 08:02:23.990207 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-flh2z" event={"ID":"cff70c7f-0a7c-4288-ae5a-74a88043586b","Type":"ContainerStarted","Data":"cf2bc5dce02d1027ac6f536486ce50a785983c589f49024a66be2757c21cd4fe"}
Oct 03 08:02:23 crc kubenswrapper[4664]: I1003 08:02:23.990801 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-flh2z"
Oct 03 08:02:23 crc kubenswrapper[4664]: I1003 08:02:23.993696 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-5cmx6" event={"ID":"53e89226-668e-4f4e-9035-293df2a37944","Type":"ContainerStarted","Data":"5ec28e202d5dab350fe5ee31b8320a0f0e5490fcb7dd4532475f1cac2ce92855"}
Oct 03 08:02:24 crc kubenswrapper[4664]: I1003 08:02:24.001875 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m" event={"ID":"c3c37a07-83fe-42c4-89b3-ab591db694aa","Type":"ContainerStarted","Data":"1086c214b49b411ecb630328cdd724929fbabd37ab5d7f3bdbf467425fa7f2fb"}
Oct 03 08:02:24 crc kubenswrapper[4664]: I1003 08:02:24.001955 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m"
Oct 03 08:02:24 crc kubenswrapper[4664]: I1003 08:02:24.010321 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-flh2z" podStartSLOduration=1.8251209830000001 podStartE2EDuration="4.010303868s" podCreationTimestamp="2025-10-03 08:02:20 +0000 UTC" firstStartedPulling="2025-10-03 08:02:21.20449602 +0000 UTC m=+842.025686510" lastFinishedPulling="2025-10-03 08:02:23.389678905 +0000 UTC m=+844.210869395" observedRunningTime="2025-10-03 08:02:24.008034586 +0000 UTC m=+844.829225086" watchObservedRunningTime="2025-10-03 08:02:24.010303868 +0000 UTC m=+844.831494358"
Oct 03 08:02:24 crc kubenswrapper[4664]: I1003 08:02:24.033684 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m" podStartSLOduration=2.818076512 podStartE2EDuration="4.033666086s" podCreationTimestamp="2025-10-03 08:02:20 +0000 UTC" firstStartedPulling="2025-10-03 08:02:22.182034288 +0000 UTC m=+843.003224778" lastFinishedPulling="2025-10-03 08:02:23.397623872 +0000 UTC m=+844.218814352" observedRunningTime="2025-10-03 08:02:24.032986908 +0000 UTC m=+844.854177408" watchObservedRunningTime="2025-10-03 08:02:24.033666086 +0000 UTC m=+844.854856576"
Oct 03 08:02:25 crc kubenswrapper[4664]: I1003 08:02:25.008198 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8" event={"ID":"6888a579-64c6-4313-bb63-5d8e09d9389c","Type":"ContainerStarted","Data":"9d15813b7e4eb6fba6e916213a70ab1164d13bb0d13a77af9e14b43dc743b808"}
Oct 03 08:02:27 crc kubenswrapper[4664]: I1003 08:02:27.020118 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-5cmx6" event={"ID":"53e89226-668e-4f4e-9035-293df2a37944","Type":"ContainerStarted","Data":"4078836c3b4f4d638ba204689d5d59806c4d884e5737c1de833e048409e0cdde"}
Oct 03 08:02:27 crc kubenswrapper[4664]: I1003 08:02:27.040518 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-l6sj8" podStartSLOduration=4.744662739 podStartE2EDuration="7.040487107s" podCreationTimestamp="2025-10-03 08:02:20 +0000 UTC" firstStartedPulling="2025-10-03 08:02:22.109395683 +0000 UTC m=+842.930586173" lastFinishedPulling="2025-10-03 08:02:24.405220051 +0000 UTC m=+845.226410541" observedRunningTime="2025-10-03 08:02:25.026316186 +0000 UTC m=+845.847506686" watchObservedRunningTime="2025-10-03 08:02:27.040487107 +0000 UTC m=+847.861677627"
Oct 03 08:02:27 crc kubenswrapper[4664]: I1003 08:02:27.043677 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-5cmx6" podStartSLOduration=2.432522214 podStartE2EDuration="7.043666124s" podCreationTimestamp="2025-10-03 08:02:20 +0000 UTC" firstStartedPulling="2025-10-03 08:02:21.617723914 +0000 UTC m=+842.438914404" lastFinishedPulling="2025-10-03 08:02:26.228867834 +0000 UTC m=+847.050058314" observedRunningTime="2025-10-03 08:02:27.038576374 +0000 UTC m=+847.859766904" watchObservedRunningTime="2025-10-03 08:02:27.043666124 +0000 UTC m=+847.864856674"
Oct 03 08:02:31 crc kubenswrapper[4664]: I1003 08:02:31.190185 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-flh2z"
Oct 03 08:02:31 crc kubenswrapper[4664]: I1003 08:02:31.516643 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6ccf4b568b-f9rcf"
Oct 03 08:02:31 crc kubenswrapper[4664]: I1003 08:02:31.516742 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6ccf4b568b-f9rcf"
Oct 03 08:02:31 crc kubenswrapper[4664]: I1003 08:02:31.524177 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6ccf4b568b-f9rcf"
Oct 03 08:02:32 crc kubenswrapper[4664]: I1003 08:02:32.058383 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6ccf4b568b-f9rcf"
Oct 03 08:02:32 crc kubenswrapper[4664]: I1003 08:02:32.117684 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-2m6x7"]
Oct 03 08:02:41 crc kubenswrapper[4664]: I1003 08:02:41.747442 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-jct4m"
Oct 03 08:02:41 crc kubenswrapper[4664]: I1003 08:02:41.987584 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 08:02:41 crc kubenswrapper[4664]: I1003 08:02:41.987683 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 08:02:41 crc kubenswrapper[4664]: I1003 08:02:41.987739 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm"
Oct 03 08:02:41 crc kubenswrapper[4664]: I1003 08:02:41.988452 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"33a410bbdb246cf2e9dcb8e9de77a40e30f71ec5cde831e8cfca46d88165b8b1"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 08:02:41 crc kubenswrapper[4664]: I1003 08:02:41.988542 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://33a410bbdb246cf2e9dcb8e9de77a40e30f71ec5cde831e8cfca46d88165b8b1" gracePeriod=600
Oct 03 08:02:42 crc kubenswrapper[4664]: I1003 08:02:42.114066 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="33a410bbdb246cf2e9dcb8e9de77a40e30f71ec5cde831e8cfca46d88165b8b1" exitCode=0
Oct 03 08:02:42 crc kubenswrapper[4664]: I1003 08:02:42.114133 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"33a410bbdb246cf2e9dcb8e9de77a40e30f71ec5cde831e8cfca46d88165b8b1"}
Oct 03 08:02:42 crc kubenswrapper[4664]: I1003 08:02:42.114559 4664 scope.go:117] "RemoveContainer" containerID="2b9821c1193b1d9cb01e00b77753645dc25b62e22b09d41a84e0eb1787f597c7"
Oct 03 08:02:43 crc kubenswrapper[4664]: I1003 08:02:43.123097 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"d72d28e356e7ba889e503f6d77ef4dcc3b64c797b9e1df46488fe0f1d0abb973"}
event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"d72d28e356e7ba889e503f6d77ef4dcc3b64c797b9e1df46488fe0f1d0abb973"} Oct 03 08:02:55 crc kubenswrapper[4664]: I1003 08:02:55.915865 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2"] Oct 03 08:02:55 crc kubenswrapper[4664]: I1003 08:02:55.917743 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" Oct 03 08:02:55 crc kubenswrapper[4664]: I1003 08:02:55.919878 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 08:02:55 crc kubenswrapper[4664]: I1003 08:02:55.923741 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2"] Oct 03 08:02:55 crc kubenswrapper[4664]: I1003 08:02:55.979686 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2\" (UID: \"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" Oct 03 08:02:55 crc kubenswrapper[4664]: I1003 08:02:55.979776 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9w6z\" (UniqueName: \"kubernetes.io/projected/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-kube-api-access-f9w6z\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2\" (UID: \"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" Oct 03 08:02:55 crc kubenswrapper[4664]: I1003 08:02:55.979968 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2\" (UID: \"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" Oct 03 08:02:56 crc kubenswrapper[4664]: I1003 08:02:56.081429 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2\" (UID: \"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" Oct 03 08:02:56 crc kubenswrapper[4664]: I1003 08:02:56.081497 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9w6z\" (UniqueName: \"kubernetes.io/projected/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-kube-api-access-f9w6z\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2\" (UID: \"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" Oct 03 08:02:56 crc kubenswrapper[4664]: I1003 08:02:56.081555 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2\" (UID: \"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" Oct 03 08:02:56 crc kubenswrapper[4664]: I1003 08:02:56.081962 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2\" (UID: \"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" Oct 03 08:02:56 crc kubenswrapper[4664]: I1003 08:02:56.082030 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2\" (UID: \"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" Oct 03 08:02:56 crc kubenswrapper[4664]: I1003 08:02:56.100474 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9w6z\" (UniqueName: \"kubernetes.io/projected/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-kube-api-access-f9w6z\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2\" (UID: \"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" Oct 03 08:02:56 crc kubenswrapper[4664]: I1003 08:02:56.234775 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" Oct 03 08:02:56 crc kubenswrapper[4664]: I1003 08:02:56.628987 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2"] Oct 03 08:02:56 crc kubenswrapper[4664]: W1003 08:02:56.635955 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf4d43c8_bf96_44f4_9c95_0d6bd4ba3b49.slice/crio-2d26b64d3ba3a8dcf1c6788253ce5767f7f7473aef32e6b534187ae06af05612 WatchSource:0}: Error finding container 2d26b64d3ba3a8dcf1c6788253ce5767f7f7473aef32e6b534187ae06af05612: Status 404 returned error can't find the container with id 2d26b64d3ba3a8dcf1c6788253ce5767f7f7473aef32e6b534187ae06af05612 Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.167118 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-2m6x7" podUID="ceef7ba8-f996-4b56-a477-23873e39cde7" containerName="console" containerID="cri-o://086931ca54355c86710ba259601c4f4286a007200d30c9e44598c5566186b8a5" gracePeriod=15 Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.205128 4664 generic.go:334] "Generic (PLEG): container finished" podID="bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49" containerID="0019186ca0613687d97b5f16287dfbd64d0cd2c6f05d67aafff73db9dc2389c3" exitCode=0 Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.205229 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" 
event={"ID":"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49","Type":"ContainerDied","Data":"0019186ca0613687d97b5f16287dfbd64d0cd2c6f05d67aafff73db9dc2389c3"} Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.205259 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" event={"ID":"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49","Type":"ContainerStarted","Data":"2d26b64d3ba3a8dcf1c6788253ce5767f7f7473aef32e6b534187ae06af05612"} Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.484334 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-2m6x7_ceef7ba8-f996-4b56-a477-23873e39cde7/console/0.log" Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.484405 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.604471 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-oauth-serving-cert\") pod \"ceef7ba8-f996-4b56-a477-23873e39cde7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.604565 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ceef7ba8-f996-4b56-a477-23873e39cde7-console-oauth-config\") pod \"ceef7ba8-f996-4b56-a477-23873e39cde7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.604618 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-trusted-ca-bundle\") pod \"ceef7ba8-f996-4b56-a477-23873e39cde7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.604642 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ceef7ba8-f996-4b56-a477-23873e39cde7-console-serving-cert\") pod \"ceef7ba8-f996-4b56-a477-23873e39cde7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.604675 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-service-ca\") pod \"ceef7ba8-f996-4b56-a477-23873e39cde7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.604694 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bwfx\" (UniqueName: \"kubernetes.io/projected/ceef7ba8-f996-4b56-a477-23873e39cde7-kube-api-access-2bwfx\") pod \"ceef7ba8-f996-4b56-a477-23873e39cde7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.604735 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-console-config\") pod \"ceef7ba8-f996-4b56-a477-23873e39cde7\" (UID: \"ceef7ba8-f996-4b56-a477-23873e39cde7\") " Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.605363 4664 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-console-config" (OuterVolumeSpecName: "console-config") pod "ceef7ba8-f996-4b56-a477-23873e39cde7" (UID: "ceef7ba8-f996-4b56-a477-23873e39cde7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.605422 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ceef7ba8-f996-4b56-a477-23873e39cde7" (UID: "ceef7ba8-f996-4b56-a477-23873e39cde7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.605467 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-service-ca" (OuterVolumeSpecName: "service-ca") pod "ceef7ba8-f996-4b56-a477-23873e39cde7" (UID: "ceef7ba8-f996-4b56-a477-23873e39cde7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.605673 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ceef7ba8-f996-4b56-a477-23873e39cde7" (UID: "ceef7ba8-f996-4b56-a477-23873e39cde7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.610849 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceef7ba8-f996-4b56-a477-23873e39cde7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ceef7ba8-f996-4b56-a477-23873e39cde7" (UID: "ceef7ba8-f996-4b56-a477-23873e39cde7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.611056 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceef7ba8-f996-4b56-a477-23873e39cde7-kube-api-access-2bwfx" (OuterVolumeSpecName: "kube-api-access-2bwfx") pod "ceef7ba8-f996-4b56-a477-23873e39cde7" (UID: "ceef7ba8-f996-4b56-a477-23873e39cde7"). InnerVolumeSpecName "kube-api-access-2bwfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.611134 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceef7ba8-f996-4b56-a477-23873e39cde7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ceef7ba8-f996-4b56-a477-23873e39cde7" (UID: "ceef7ba8-f996-4b56-a477-23873e39cde7"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.705865 4664 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.705910 4664 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.705924 4664 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ceef7ba8-f996-4b56-a477-23873e39cde7-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.705936 4664 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.705946 4664 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ceef7ba8-f996-4b56-a477-23873e39cde7-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.705957 4664 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ceef7ba8-f996-4b56-a477-23873e39cde7-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:57 crc kubenswrapper[4664]: I1003 08:02:57.705967 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bwfx\" (UniqueName: \"kubernetes.io/projected/ceef7ba8-f996-4b56-a477-23873e39cde7-kube-api-access-2bwfx\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:58 crc kubenswrapper[4664]: I1003 08:02:58.211434 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-2m6x7_ceef7ba8-f996-4b56-a477-23873e39cde7/console/0.log" Oct 03 08:02:58 crc kubenswrapper[4664]: I1003 08:02:58.211481 4664 generic.go:334] "Generic (PLEG): container finished" podID="ceef7ba8-f996-4b56-a477-23873e39cde7" containerID="086931ca54355c86710ba259601c4f4286a007200d30c9e44598c5566186b8a5" exitCode=2 Oct 03 08:02:58 crc kubenswrapper[4664]: I1003 08:02:58.211519 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2m6x7" event={"ID":"ceef7ba8-f996-4b56-a477-23873e39cde7","Type":"ContainerDied","Data":"086931ca54355c86710ba259601c4f4286a007200d30c9e44598c5566186b8a5"} Oct 03 08:02:58 crc kubenswrapper[4664]: I1003 08:02:58.211562 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2m6x7" event={"ID":"ceef7ba8-f996-4b56-a477-23873e39cde7","Type":"ContainerDied","Data":"680a65dcdfed2bdc8b9984b9cfa41a1021e3051184894e539050a71b14a4bdc3"} Oct 03 08:02:58 crc kubenswrapper[4664]: I1003 08:02:58.211579 4664 scope.go:117] "RemoveContainer" containerID="086931ca54355c86710ba259601c4f4286a007200d30c9e44598c5566186b8a5" Oct 03 08:02:58 crc kubenswrapper[4664]: I1003 08:02:58.211534 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-2m6x7" Oct 03 08:02:58 crc kubenswrapper[4664]: I1003 08:02:58.232836 4664 scope.go:117] "RemoveContainer" containerID="086931ca54355c86710ba259601c4f4286a007200d30c9e44598c5566186b8a5" Oct 03 08:02:58 crc kubenswrapper[4664]: I1003 08:02:58.235158 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-2m6x7"] Oct 03 08:02:58 crc kubenswrapper[4664]: E1003 08:02:58.236005 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"086931ca54355c86710ba259601c4f4286a007200d30c9e44598c5566186b8a5\": container with ID starting with 086931ca54355c86710ba259601c4f4286a007200d30c9e44598c5566186b8a5 not found: ID does not exist" containerID="086931ca54355c86710ba259601c4f4286a007200d30c9e44598c5566186b8a5" Oct 03 08:02:58 crc kubenswrapper[4664]: I1003 08:02:58.236069 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086931ca54355c86710ba259601c4f4286a007200d30c9e44598c5566186b8a5"} err="failed to get container status \"086931ca54355c86710ba259601c4f4286a007200d30c9e44598c5566186b8a5\": rpc error: code = NotFound desc = could not find container \"086931ca54355c86710ba259601c4f4286a007200d30c9e44598c5566186b8a5\": container with ID starting with 086931ca54355c86710ba259601c4f4286a007200d30c9e44598c5566186b8a5 not found: ID does not exist" Oct 03 08:02:58 crc kubenswrapper[4664]: I1003 08:02:58.238791 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-2m6x7"] Oct 03 08:02:59 crc kubenswrapper[4664]: I1003 08:02:59.223280 4664 generic.go:334] "Generic (PLEG): container finished" podID="bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49" containerID="2d4ec824808b08b684181520ce3d3c047e7336b8f2498d8b755bb498e6f9647c" exitCode=0 Oct 03 08:02:59 crc kubenswrapper[4664]: I1003 08:02:59.223513 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" event={"ID":"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49","Type":"ContainerDied","Data":"2d4ec824808b08b684181520ce3d3c047e7336b8f2498d8b755bb498e6f9647c"} Oct 03 08:02:59 crc kubenswrapper[4664]: I1003 08:02:59.882283 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceef7ba8-f996-4b56-a477-23873e39cde7" path="/var/lib/kubelet/pods/ceef7ba8-f996-4b56-a477-23873e39cde7/volumes" Oct 03 08:03:00 crc kubenswrapper[4664]: I1003 08:03:00.230479 4664 generic.go:334] "Generic (PLEG): container finished" podID="bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49" containerID="b2419c3307d34a6ddc9b7546ab5e2048305d5d08d03815975200fd7506a01bc8" exitCode=0 Oct 03 08:03:00 crc kubenswrapper[4664]: I1003 08:03:00.230546 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" event={"ID":"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49","Type":"ContainerDied","Data":"b2419c3307d34a6ddc9b7546ab5e2048305d5d08d03815975200fd7506a01bc8"} Oct 03 08:03:01 crc kubenswrapper[4664]: I1003 08:03:01.461980 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" Oct 03 08:03:01 crc kubenswrapper[4664]: I1003 08:03:01.568875 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9w6z\" (UniqueName: \"kubernetes.io/projected/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-kube-api-access-f9w6z\") pod \"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49\" (UID: \"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49\") " Oct 03 08:03:01 crc kubenswrapper[4664]: I1003 08:03:01.569005 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-util\") pod \"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49\" (UID: \"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49\") " Oct 03 08:03:01 crc kubenswrapper[4664]: I1003 08:03:01.569721 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-bundle\") pod \"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49\" (UID: \"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49\") " Oct 03 08:03:01 crc kubenswrapper[4664]: I1003 08:03:01.571528 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-bundle" (OuterVolumeSpecName: "bundle") pod "bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49" (UID: "bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:03:01 crc kubenswrapper[4664]: I1003 08:03:01.576291 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-kube-api-access-f9w6z" (OuterVolumeSpecName: "kube-api-access-f9w6z") pod "bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49" (UID: "bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49"). InnerVolumeSpecName "kube-api-access-f9w6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:03:01 crc kubenswrapper[4664]: I1003 08:03:01.584754 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-util" (OuterVolumeSpecName: "util") pod "bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49" (UID: "bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:03:01 crc kubenswrapper[4664]: I1003 08:03:01.671592 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9w6z\" (UniqueName: \"kubernetes.io/projected/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-kube-api-access-f9w6z\") on node \"crc\" DevicePath \"\"" Oct 03 08:03:01 crc kubenswrapper[4664]: I1003 08:03:01.671649 4664 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-util\") on node \"crc\" DevicePath \"\"" Oct 03 08:03:01 crc kubenswrapper[4664]: I1003 08:03:01.671663 4664 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:03:02 crc kubenswrapper[4664]: I1003 08:03:02.247133 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" event={"ID":"bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49","Type":"ContainerDied","Data":"2d26b64d3ba3a8dcf1c6788253ce5767f7f7473aef32e6b534187ae06af05612"} Oct 03 08:03:02 crc kubenswrapper[4664]: I1003 08:03:02.247175 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d26b64d3ba3a8dcf1c6788253ce5767f7f7473aef32e6b534187ae06af05612" Oct 03 08:03:02 crc kubenswrapper[4664]: I1003 08:03:02.247196 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2" Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.800321 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv"] Oct 03 08:03:11 crc kubenswrapper[4664]: E1003 08:03:11.801641 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef7ba8-f996-4b56-a477-23873e39cde7" containerName="console" Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.801661 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef7ba8-f996-4b56-a477-23873e39cde7" containerName="console" Oct 03 08:03:11 crc kubenswrapper[4664]: E1003 08:03:11.801684 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49" containerName="extract" Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.801691 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49" containerName="extract" Oct 03 08:03:11 crc kubenswrapper[4664]: E1003 08:03:11.801705 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49" containerName="util" Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.801713 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49" containerName="util" Oct 03 08:03:11 crc kubenswrapper[4664]: E1003 08:03:11.801733 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49" containerName="pull" Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.801741 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49" containerName="pull" Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.801894 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceef7ba8-f996-4b56-a477-23873e39cde7" containerName="console" Oct 
03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.801908 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49" containerName="extract" Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.802572 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.810112 4664 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-4zgxj" Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.810411 4664 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.810552 4664 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.810764 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.810766 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.831926 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv"] Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.911414 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfwnl\" (UniqueName: \"kubernetes.io/projected/3493d554-e23b-40b4-b583-45650c6e2b0a-kube-api-access-bfwnl\") pod \"metallb-operator-controller-manager-78d5f5dfc9-nqdfv\" (UID: \"3493d554-e23b-40b4-b583-45650c6e2b0a\") " pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.911504 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3493d554-e23b-40b4-b583-45650c6e2b0a-webhook-cert\") pod \"metallb-operator-controller-manager-78d5f5dfc9-nqdfv\" (UID: \"3493d554-e23b-40b4-b583-45650c6e2b0a\") " pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" Oct 03 08:03:11 crc kubenswrapper[4664]: I1003 08:03:11.911620 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3493d554-e23b-40b4-b583-45650c6e2b0a-apiservice-cert\") pod \"metallb-operator-controller-manager-78d5f5dfc9-nqdfv\" (UID: \"3493d554-e23b-40b4-b583-45650c6e2b0a\") " pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.012692 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3493d554-e23b-40b4-b583-45650c6e2b0a-webhook-cert\") pod \"metallb-operator-controller-manager-78d5f5dfc9-nqdfv\" (UID: \"3493d554-e23b-40b4-b583-45650c6e2b0a\") " pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.012816 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/3493d554-e23b-40b4-b583-45650c6e2b0a-apiservice-cert\") pod \"metallb-operator-controller-manager-78d5f5dfc9-nqdfv\" (UID: \"3493d554-e23b-40b4-b583-45650c6e2b0a\") " pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.012869 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfwnl\" (UniqueName: \"kubernetes.io/projected/3493d554-e23b-40b4-b583-45650c6e2b0a-kube-api-access-bfwnl\") pod \"metallb-operator-controller-manager-78d5f5dfc9-nqdfv\" (UID: \"3493d554-e23b-40b4-b583-45650c6e2b0a\") " pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.020732 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3493d554-e23b-40b4-b583-45650c6e2b0a-apiservice-cert\") pod \"metallb-operator-controller-manager-78d5f5dfc9-nqdfv\" (UID: \"3493d554-e23b-40b4-b583-45650c6e2b0a\") " pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.021061 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3493d554-e23b-40b4-b583-45650c6e2b0a-webhook-cert\") pod \"metallb-operator-controller-manager-78d5f5dfc9-nqdfv\" (UID: \"3493d554-e23b-40b4-b583-45650c6e2b0a\") " pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.029219 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h"] Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.030234 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.033653 4664 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-p4t8g" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.034155 4664 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.034260 4664 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.037730 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfwnl\" (UniqueName: \"kubernetes.io/projected/3493d554-e23b-40b4-b583-45650c6e2b0a-kube-api-access-bfwnl\") pod \"metallb-operator-controller-manager-78d5f5dfc9-nqdfv\" (UID: \"3493d554-e23b-40b4-b583-45650c6e2b0a\") " pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.055361 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h"] Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.116100 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10515fdf-6940-4258-bd80-714dbb847505-apiservice-cert\") pod \"metallb-operator-webhook-server-5df98bb99-xxf5h\" (UID: \"10515fdf-6940-4258-bd80-714dbb847505\") " pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.116155 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10515fdf-6940-4258-bd80-714dbb847505-webhook-cert\") pod \"metallb-operator-webhook-server-5df98bb99-xxf5h\" (UID: \"10515fdf-6940-4258-bd80-714dbb847505\") " pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.116189 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k44rj\" (UniqueName: \"kubernetes.io/projected/10515fdf-6940-4258-bd80-714dbb847505-kube-api-access-k44rj\") pod \"metallb-operator-webhook-server-5df98bb99-xxf5h\" (UID: \"10515fdf-6940-4258-bd80-714dbb847505\") " pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.128397 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.217739 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k44rj\" (UniqueName: \"kubernetes.io/projected/10515fdf-6940-4258-bd80-714dbb847505-kube-api-access-k44rj\") pod \"metallb-operator-webhook-server-5df98bb99-xxf5h\" (UID: \"10515fdf-6940-4258-bd80-714dbb847505\") " pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.217870 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10515fdf-6940-4258-bd80-714dbb847505-apiservice-cert\") pod \"metallb-operator-webhook-server-5df98bb99-xxf5h\" (UID: \"10515fdf-6940-4258-bd80-714dbb847505\") " pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.217893 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10515fdf-6940-4258-bd80-714dbb847505-webhook-cert\") pod \"metallb-operator-webhook-server-5df98bb99-xxf5h\" (UID: \"10515fdf-6940-4258-bd80-714dbb847505\") " pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.223383 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10515fdf-6940-4258-bd80-714dbb847505-webhook-cert\") pod \"metallb-operator-webhook-server-5df98bb99-xxf5h\" (UID: \"10515fdf-6940-4258-bd80-714dbb847505\") " pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.223429 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10515fdf-6940-4258-bd80-714dbb847505-apiservice-cert\") pod \"metallb-operator-webhook-server-5df98bb99-xxf5h\" (UID: \"10515fdf-6940-4258-bd80-714dbb847505\") " pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.240914 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k44rj\" (UniqueName: \"kubernetes.io/projected/10515fdf-6940-4258-bd80-714dbb847505-kube-api-access-k44rj\") pod \"metallb-operator-webhook-server-5df98bb99-xxf5h\" (UID: \"10515fdf-6940-4258-bd80-714dbb847505\") " pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.386290 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.413921 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv"] Oct 03 08:03:12 crc kubenswrapper[4664]: I1003 08:03:12.593498 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h"] Oct 03 08:03:13 crc kubenswrapper[4664]: I1003 08:03:13.305887 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" event={"ID":"10515fdf-6940-4258-bd80-714dbb847505","Type":"ContainerStarted","Data":"b6ef008331b8d581984dea2878364c45a6a4335297ccc61aa22c5d98841f6de5"} Oct 03 08:03:13 crc kubenswrapper[4664]: I1003 08:03:13.308236 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" event={"ID":"3493d554-e23b-40b4-b583-45650c6e2b0a","Type":"ContainerStarted","Data":"e50b65885fe540d0bc35ce5feb96be2e2b18270ab96d42318ac842e143b33560"} Oct 03 08:03:16 crc kubenswrapper[4664]: I1003 08:03:16.330851 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" event={"ID":"3493d554-e23b-40b4-b583-45650c6e2b0a","Type":"ContainerStarted","Data":"20d285427817f66ef7eea5f02d3893719486a428811b233c3741cb64f61466d3"} Oct 03 08:03:16 crc kubenswrapper[4664]: I1003 08:03:16.331405 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" Oct 03 08:03:16 crc kubenswrapper[4664]: I1003 08:03:16.361543 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" podStartSLOduration=2.50660165 podStartE2EDuration="5.361516979s" podCreationTimestamp="2025-10-03 08:03:11 +0000 UTC" firstStartedPulling="2025-10-03 08:03:12.428269784 +0000 UTC m=+893.249460274" lastFinishedPulling="2025-10-03 08:03:15.283185113 +0000 UTC m=+896.104375603" observedRunningTime="2025-10-03 08:03:16.359203753 +0000 UTC m=+897.180394253" watchObservedRunningTime="2025-10-03 08:03:16.361516979 +0000 UTC m=+897.182707469" Oct 03 08:03:17 crc kubenswrapper[4664]: I1003 08:03:17.337471 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" event={"ID":"10515fdf-6940-4258-bd80-714dbb847505","Type":"ContainerStarted","Data":"e495369174b702a624c1cf516c7101bba8fd024257775e8e18547ce9bf1f5ffc"} Oct 03 08:03:17 crc kubenswrapper[4664]: I1003 08:03:17.357859 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" podStartSLOduration=1.154152465 podStartE2EDuration="5.357838321s" podCreationTimestamp="2025-10-03 08:03:12 +0000 UTC" firstStartedPulling="2025-10-03 08:03:12.605671749 +0000 UTC m=+893.426862239" lastFinishedPulling="2025-10-03 08:03:16.809357605 +0000 UTC m=+897.630548095" observedRunningTime="2025-10-03 08:03:17.355184146 +0000 UTC m=+898.176374656" watchObservedRunningTime="2025-10-03 08:03:17.357838321 +0000 UTC m=+898.179028811" Oct 03 08:03:18 crc kubenswrapper[4664]: I1003 08:03:18.342958 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" Oct 03 
08:03:32 crc kubenswrapper[4664]: I1003 08:03:32.390554 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5df98bb99-xxf5h" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.131955 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-78d5f5dfc9-nqdfv" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.795895 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5vzcf"] Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.798103 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.800076 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.800110 4664 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.800076 4664 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-qdf2r" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.807045 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4"] Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.807909 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.810107 4664 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.825928 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4"] Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.889205 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1604a32c-c92b-4bed-9402-1ea47abcc2ea-frr-startup\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.889263 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1604a32c-c92b-4bed-9402-1ea47abcc2ea-reloader\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.889297 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1604a32c-c92b-4bed-9402-1ea47abcc2ea-frr-sockets\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.889325 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwn2h\" (UniqueName: \"kubernetes.io/projected/28c891c2-52c6-400d-95c8-5c29bb6962a7-kube-api-access-kwn2h\") pod \"frr-k8s-webhook-server-64bf5d555-j98p4\" (UID: \"28c891c2-52c6-400d-95c8-5c29bb6962a7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4" 
Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.889420 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r764s\" (UniqueName: \"kubernetes.io/projected/1604a32c-c92b-4bed-9402-1ea47abcc2ea-kube-api-access-r764s\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.889450 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1604a32c-c92b-4bed-9402-1ea47abcc2ea-frr-conf\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.889478 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28c891c2-52c6-400d-95c8-5c29bb6962a7-cert\") pod \"frr-k8s-webhook-server-64bf5d555-j98p4\" (UID: \"28c891c2-52c6-400d-95c8-5c29bb6962a7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.889499 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1604a32c-c92b-4bed-9402-1ea47abcc2ea-metrics-certs\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.889524 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1604a32c-c92b-4bed-9402-1ea47abcc2ea-metrics\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.917342 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-rq5kc"] Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.918369 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-rq5kc" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.921925 4664 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.922159 4664 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.924426 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-nlqsf"] Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.924907 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.925300 4664 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-4b8cc" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.925720 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-nlqsf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.928741 4664 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.934484 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-nlqsf"] Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991080 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1604a32c-c92b-4bed-9402-1ea47abcc2ea-frr-startup\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991124 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1604a32c-c92b-4bed-9402-1ea47abcc2ea-reloader\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991164 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1604a32c-c92b-4bed-9402-1ea47abcc2ea-frr-sockets\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991185 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwn2h\" (UniqueName: \"kubernetes.io/projected/28c891c2-52c6-400d-95c8-5c29bb6962a7-kube-api-access-kwn2h\") pod \"frr-k8s-webhook-server-64bf5d555-j98p4\" (UID: \"28c891c2-52c6-400d-95c8-5c29bb6962a7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991216 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-499km\" (UniqueName: \"kubernetes.io/projected/876be351-8fa7-4af6-b979-17941886901e-kube-api-access-499km\") pod \"speaker-rq5kc\" (UID: \"876be351-8fa7-4af6-b979-17941886901e\") " pod="metallb-system/speaker-rq5kc" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991237 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/876be351-8fa7-4af6-b979-17941886901e-metallb-excludel2\") pod \"speaker-rq5kc\" (UID: \"876be351-8fa7-4af6-b979-17941886901e\") " pod="metallb-system/speaker-rq5kc" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991263 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r764s\" (UniqueName: \"kubernetes.io/projected/1604a32c-c92b-4bed-9402-1ea47abcc2ea-kube-api-access-r764s\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991282 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1604a32c-c92b-4bed-9402-1ea47abcc2ea-frr-conf\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991299 4664 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6adfa358-0d56-406d-ac16-a982e3049c21-metrics-certs\") pod \"controller-68d546b9d8-nlqsf\" (UID: \"6adfa358-0d56-406d-ac16-a982e3049c21\") " pod="metallb-system/controller-68d546b9d8-nlqsf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991315 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57xsq\" (UniqueName: \"kubernetes.io/projected/6adfa358-0d56-406d-ac16-a982e3049c21-kube-api-access-57xsq\") pod \"controller-68d546b9d8-nlqsf\" (UID: \"6adfa358-0d56-406d-ac16-a982e3049c21\") " pod="metallb-system/controller-68d546b9d8-nlqsf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991333 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/876be351-8fa7-4af6-b979-17941886901e-metrics-certs\") pod \"speaker-rq5kc\" (UID: \"876be351-8fa7-4af6-b979-17941886901e\") " pod="metallb-system/speaker-rq5kc" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991361 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28c891c2-52c6-400d-95c8-5c29bb6962a7-cert\") pod \"frr-k8s-webhook-server-64bf5d555-j98p4\" (UID: \"28c891c2-52c6-400d-95c8-5c29bb6962a7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991378 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1604a32c-c92b-4bed-9402-1ea47abcc2ea-metrics-certs\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991392 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6adfa358-0d56-406d-ac16-a982e3049c21-cert\") pod \"controller-68d546b9d8-nlqsf\" (UID: \"6adfa358-0d56-406d-ac16-a982e3049c21\") " pod="metallb-system/controller-68d546b9d8-nlqsf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991410 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/876be351-8fa7-4af6-b979-17941886901e-memberlist\") pod \"speaker-rq5kc\" (UID: \"876be351-8fa7-4af6-b979-17941886901e\") " pod="metallb-system/speaker-rq5kc" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.991425 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1604a32c-c92b-4bed-9402-1ea47abcc2ea-metrics\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.992232 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1604a32c-c92b-4bed-9402-1ea47abcc2ea-frr-startup\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.992266 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/1604a32c-c92b-4bed-9402-1ea47abcc2ea-frr-sockets\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: E1003 08:03:52.992341 4664 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.992386 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1604a32c-c92b-4bed-9402-1ea47abcc2ea-frr-conf\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: E1003 08:03:52.992392 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28c891c2-52c6-400d-95c8-5c29bb6962a7-cert podName:28c891c2-52c6-400d-95c8-5c29bb6962a7 nodeName:}" failed. No retries permitted until 2025-10-03 08:03:53.492376895 +0000 UTC m=+934.313567385 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/28c891c2-52c6-400d-95c8-5c29bb6962a7-cert") pod "frr-k8s-webhook-server-64bf5d555-j98p4" (UID: "28c891c2-52c6-400d-95c8-5c29bb6962a7") : secret "frr-k8s-webhook-server-cert" not found Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.992663 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1604a32c-c92b-4bed-9402-1ea47abcc2ea-metrics\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: E1003 08:03:52.992390 4664 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 03 08:03:52 crc kubenswrapper[4664]: I1003 08:03:52.992719 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1604a32c-c92b-4bed-9402-1ea47abcc2ea-reloader\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:03:52 crc kubenswrapper[4664]: E1003 08:03:52.992727 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1604a32c-c92b-4bed-9402-1ea47abcc2ea-metrics-certs podName:1604a32c-c92b-4bed-9402-1ea47abcc2ea nodeName:}" failed. No retries permitted until 2025-10-03 08:03:53.492710974 +0000 UTC m=+934.313901464 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1604a32c-c92b-4bed-9402-1ea47abcc2ea-metrics-certs") pod "frr-k8s-5vzcf" (UID: "1604a32c-c92b-4bed-9402-1ea47abcc2ea") : secret "frr-k8s-certs-secret" not found
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.016447 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r764s\" (UniqueName: \"kubernetes.io/projected/1604a32c-c92b-4bed-9402-1ea47abcc2ea-kube-api-access-r764s\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.016534 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwn2h\" (UniqueName: \"kubernetes.io/projected/28c891c2-52c6-400d-95c8-5c29bb6962a7-kube-api-access-kwn2h\") pod \"frr-k8s-webhook-server-64bf5d555-j98p4\" (UID: \"28c891c2-52c6-400d-95c8-5c29bb6962a7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.093052 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-499km\" (UniqueName: \"kubernetes.io/projected/876be351-8fa7-4af6-b979-17941886901e-kube-api-access-499km\") pod \"speaker-rq5kc\" (UID: \"876be351-8fa7-4af6-b979-17941886901e\") " pod="metallb-system/speaker-rq5kc"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.093108 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/876be351-8fa7-4af6-b979-17941886901e-metallb-excludel2\") pod \"speaker-rq5kc\" (UID: \"876be351-8fa7-4af6-b979-17941886901e\") " pod="metallb-system/speaker-rq5kc"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.093147 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6adfa358-0d56-406d-ac16-a982e3049c21-metrics-certs\") pod \"controller-68d546b9d8-nlqsf\" (UID: \"6adfa358-0d56-406d-ac16-a982e3049c21\") " pod="metallb-system/controller-68d546b9d8-nlqsf"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.093167 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57xsq\" (UniqueName: \"kubernetes.io/projected/6adfa358-0d56-406d-ac16-a982e3049c21-kube-api-access-57xsq\") pod \"controller-68d546b9d8-nlqsf\" (UID: \"6adfa358-0d56-406d-ac16-a982e3049c21\") " pod="metallb-system/controller-68d546b9d8-nlqsf"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.093187 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/876be351-8fa7-4af6-b979-17941886901e-metrics-certs\") pod \"speaker-rq5kc\" (UID: \"876be351-8fa7-4af6-b979-17941886901e\") " pod="metallb-system/speaker-rq5kc"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.093229 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6adfa358-0d56-406d-ac16-a982e3049c21-cert\") pod \"controller-68d546b9d8-nlqsf\" (UID: \"6adfa358-0d56-406d-ac16-a982e3049c21\") " pod="metallb-system/controller-68d546b9d8-nlqsf"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.093247 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/876be351-8fa7-4af6-b979-17941886901e-memberlist\") pod \"speaker-rq5kc\" (UID: \"876be351-8fa7-4af6-b979-17941886901e\") " pod="metallb-system/speaker-rq5kc"
Oct 03 08:03:53 crc kubenswrapper[4664]: E1003 08:03:53.093389 4664 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 03 08:03:53 crc kubenswrapper[4664]: E1003 08:03:53.093451 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/876be351-8fa7-4af6-b979-17941886901e-memberlist podName:876be351-8fa7-4af6-b979-17941886901e nodeName:}" failed. No retries permitted until 2025-10-03 08:03:53.593434239 +0000 UTC m=+934.414624729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/876be351-8fa7-4af6-b979-17941886901e-memberlist") pod "speaker-rq5kc" (UID: "876be351-8fa7-4af6-b979-17941886901e") : secret "metallb-memberlist" not found
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.094414 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/876be351-8fa7-4af6-b979-17941886901e-metallb-excludel2\") pod \"speaker-rq5kc\" (UID: \"876be351-8fa7-4af6-b979-17941886901e\") " pod="metallb-system/speaker-rq5kc"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.095163 4664 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.098102 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/876be351-8fa7-4af6-b979-17941886901e-metrics-certs\") pod \"speaker-rq5kc\" (UID: \"876be351-8fa7-4af6-b979-17941886901e\") " pod="metallb-system/speaker-rq5kc"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.098623 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6adfa358-0d56-406d-ac16-a982e3049c21-metrics-certs\") pod \"controller-68d546b9d8-nlqsf\" (UID: \"6adfa358-0d56-406d-ac16-a982e3049c21\") " pod="metallb-system/controller-68d546b9d8-nlqsf"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.108136 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6adfa358-0d56-406d-ac16-a982e3049c21-cert\") pod \"controller-68d546b9d8-nlqsf\" (UID: \"6adfa358-0d56-406d-ac16-a982e3049c21\") " pod="metallb-system/controller-68d546b9d8-nlqsf"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.110704 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-499km\" (UniqueName: \"kubernetes.io/projected/876be351-8fa7-4af6-b979-17941886901e-kube-api-access-499km\") pod \"speaker-rq5kc\" (UID: \"876be351-8fa7-4af6-b979-17941886901e\") " pod="metallb-system/speaker-rq5kc"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.114889 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57xsq\" (UniqueName: \"kubernetes.io/projected/6adfa358-0d56-406d-ac16-a982e3049c21-kube-api-access-57xsq\") pod \"controller-68d546b9d8-nlqsf\" (UID: \"6adfa358-0d56-406d-ac16-a982e3049c21\") " pod="metallb-system/controller-68d546b9d8-nlqsf"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.263061 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-nlqsf"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.499340 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28c891c2-52c6-400d-95c8-5c29bb6962a7-cert\") pod \"frr-k8s-webhook-server-64bf5d555-j98p4\" (UID: \"28c891c2-52c6-400d-95c8-5c29bb6962a7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.499658 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1604a32c-c92b-4bed-9402-1ea47abcc2ea-metrics-certs\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.503334 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28c891c2-52c6-400d-95c8-5c29bb6962a7-cert\") pod \"frr-k8s-webhook-server-64bf5d555-j98p4\" (UID: \"28c891c2-52c6-400d-95c8-5c29bb6962a7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.503516 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1604a32c-c92b-4bed-9402-1ea47abcc2ea-metrics-certs\") pod \"frr-k8s-5vzcf\" (UID: \"1604a32c-c92b-4bed-9402-1ea47abcc2ea\") " pod="metallb-system/frr-k8s-5vzcf"
Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.601514 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/876be351-8fa7-4af6-b979-17941886901e-memberlist\") pod \"speaker-rq5kc\" (UID: \"876be351-8fa7-4af6-b979-17941886901e\") " pod="metallb-system/speaker-rq5kc"
Oct 03 08:03:53 crc kubenswrapper[4664]: E1003 08:03:53.601678 4664 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 03 08:03:53 crc kubenswrapper[4664]: E1003 08:03:53.601764 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/876be351-8fa7-4af6-b979-17941886901e-memberlist podName:876be351-8fa7-4af6-b979-17941886901e nodeName:}" failed. No retries permitted until 2025-10-03 08:03:54.601741147 +0000 UTC m=+935.422931637 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/876be351-8fa7-4af6-b979-17941886901e-memberlist") pod "speaker-rq5kc" (UID: "876be351-8fa7-4af6-b979-17941886901e") : secret "metallb-memberlist" not found
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4" Oct 03 08:03:53 crc kubenswrapper[4664]: I1003 08:03:53.962086 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4"] Oct 03 08:03:53 crc kubenswrapper[4664]: W1003 08:03:53.971828 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28c891c2_52c6_400d_95c8_5c29bb6962a7.slice/crio-817218f4bec4bf2da0acc8dc9fb03188c0be7a19638f0d6aa782c103ca2cfaf6 WatchSource:0}: Error finding container 817218f4bec4bf2da0acc8dc9fb03188c0be7a19638f0d6aa782c103ca2cfaf6: Status 404 returned error can't find the container with id 817218f4bec4bf2da0acc8dc9fb03188c0be7a19638f0d6aa782c103ca2cfaf6 Oct 03 08:03:54 crc kubenswrapper[4664]: I1003 08:03:54.536569 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vzcf" event={"ID":"1604a32c-c92b-4bed-9402-1ea47abcc2ea","Type":"ContainerStarted","Data":"c774dc88986c8cdfda146e9cecce0380de208d45b809fab53e2833f6687c9281"} Oct 03 08:03:54 crc kubenswrapper[4664]: I1003 08:03:54.538699 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4" event={"ID":"28c891c2-52c6-400d-95c8-5c29bb6962a7","Type":"ContainerStarted","Data":"817218f4bec4bf2da0acc8dc9fb03188c0be7a19638f0d6aa782c103ca2cfaf6"} Oct 03 08:03:54 crc kubenswrapper[4664]: I1003 08:03:54.540544 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-nlqsf" event={"ID":"6adfa358-0d56-406d-ac16-a982e3049c21","Type":"ContainerStarted","Data":"4597827393e4da806e9a62dd78ef61aefdb7f09734bd86dbfdb2d4056c2988ed"} Oct 03 08:03:54 crc kubenswrapper[4664]: I1003 08:03:54.540572 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-nlqsf" event={"ID":"6adfa358-0d56-406d-ac16-a982e3049c21","Type":"ContainerStarted","Data":"c4b16442d5fb6faacd2e8b95321af7dd892050794089746eebcf3460b096b4f8"} Oct 03 08:03:54 crc kubenswrapper[4664]: I1003 08:03:54.540582 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-nlqsf" event={"ID":"6adfa358-0d56-406d-ac16-a982e3049c21","Type":"ContainerStarted","Data":"f8f3a77f0768d0df3de2d025852bc969eb1fc87800dd06de4c4460f0b71de93d"} Oct 03 08:03:54 crc kubenswrapper[4664]: I1003 08:03:54.571481 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-nlqsf" podStartSLOduration=2.571462405 podStartE2EDuration="2.571462405s" podCreationTimestamp="2025-10-03 08:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:03:54.566082361 +0000 UTC m=+935.387272861" watchObservedRunningTime="2025-10-03 08:03:54.571462405 +0000 UTC m=+935.392652895" Oct 03 08:03:54 crc kubenswrapper[4664]: I1003 08:03:54.615330 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/876be351-8fa7-4af6-b979-17941886901e-memberlist\") pod \"speaker-rq5kc\" (UID: \"876be351-8fa7-4af6-b979-17941886901e\") " pod="metallb-system/speaker-rq5kc" Oct 03 08:03:54 crc kubenswrapper[4664]: I1003 08:03:54.622032 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/876be351-8fa7-4af6-b979-17941886901e-memberlist\") pod \"speaker-rq5kc\" (UID: \"876be351-8fa7-4af6-b979-17941886901e\") " pod="metallb-system/speaker-rq5kc" Oct 03 08:03:54 crc kubenswrapper[4664]: I1003 08:03:54.744338 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-rq5kc" Oct 03 08:03:55 crc kubenswrapper[4664]: I1003 08:03:55.548837 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rq5kc" event={"ID":"876be351-8fa7-4af6-b979-17941886901e","Type":"ContainerStarted","Data":"dbc81aa45d46936f53e99fdfbff12a1e991519815bc65ada9980a894700670a3"} Oct 03 08:03:55 crc kubenswrapper[4664]: I1003 08:03:55.549152 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-nlqsf" Oct 03 08:03:55 crc kubenswrapper[4664]: I1003 08:03:55.549164 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rq5kc" event={"ID":"876be351-8fa7-4af6-b979-17941886901e","Type":"ContainerStarted","Data":"d6d18f776620661f1330a209689c5ec714bcf41e4454cbe32274745969a109d3"} Oct 03 08:03:55 crc kubenswrapper[4664]: I1003 08:03:55.549173 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rq5kc" event={"ID":"876be351-8fa7-4af6-b979-17941886901e","Type":"ContainerStarted","Data":"0401393e25fef8d3e43d1eac6b8c5a4feaf44d105b0e5138f6d4bae1608a6336"} Oct 03 08:03:55 crc kubenswrapper[4664]: I1003 08:03:55.549758 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-rq5kc" Oct 03 08:03:55 crc kubenswrapper[4664]: I1003 08:03:55.578833 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-rq5kc" podStartSLOduration=3.578810647 podStartE2EDuration="3.578810647s" podCreationTimestamp="2025-10-03 08:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:03:55.576108909 +0000 UTC m=+936.397299409" watchObservedRunningTime="2025-10-03 08:03:55.578810647 +0000 UTC m=+936.400001157" Oct 03 08:04:01 crc kubenswrapper[4664]: I1003 08:04:01.607458 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4" event={"ID":"28c891c2-52c6-400d-95c8-5c29bb6962a7","Type":"ContainerStarted","Data":"86e690767b4d854959b4d0d6a840e0ffe32e52bf603e0645226c9b1c8c7d64c0"} Oct 03 08:04:01 crc kubenswrapper[4664]: I1003 08:04:01.609035 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4" Oct 03 08:04:01 crc kubenswrapper[4664]: I1003 08:04:01.611247 4664 generic.go:334] "Generic (PLEG): container finished" podID="1604a32c-c92b-4bed-9402-1ea47abcc2ea" containerID="66c8bd9f9b4ff929134ac6e3b2fa468bc73fd65c615d1ed5c644f01096cba809" exitCode=0 Oct 03 08:04:01 crc kubenswrapper[4664]: I1003 08:04:01.611277 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vzcf" event={"ID":"1604a32c-c92b-4bed-9402-1ea47abcc2ea","Type":"ContainerDied","Data":"66c8bd9f9b4ff929134ac6e3b2fa468bc73fd65c615d1ed5c644f01096cba809"} Oct 03 08:04:01 crc kubenswrapper[4664]: I1003 08:04:01.627060 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4" podStartSLOduration=2.190782329 podStartE2EDuration="9.627041775s" podCreationTimestamp="2025-10-03 
08:03:52 +0000 UTC" firstStartedPulling="2025-10-03 08:03:53.974828815 +0000 UTC m=+934.796019305" lastFinishedPulling="2025-10-03 08:04:01.411088261 +0000 UTC m=+942.232278751" observedRunningTime="2025-10-03 08:04:01.623511604 +0000 UTC m=+942.444702104" watchObservedRunningTime="2025-10-03 08:04:01.627041775 +0000 UTC m=+942.448232265" Oct 03 08:04:02 crc kubenswrapper[4664]: I1003 08:04:02.619330 4664 generic.go:334] "Generic (PLEG): container finished" podID="1604a32c-c92b-4bed-9402-1ea47abcc2ea" containerID="06b5c2ca79cbb316a1c98b8d8ce3f23aeb37a966ff051c10a5f88c7fadab51f7" exitCode=0 Oct 03 08:04:02 crc kubenswrapper[4664]: I1003 08:04:02.619375 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vzcf" event={"ID":"1604a32c-c92b-4bed-9402-1ea47abcc2ea","Type":"ContainerDied","Data":"06b5c2ca79cbb316a1c98b8d8ce3f23aeb37a966ff051c10a5f88c7fadab51f7"} Oct 03 08:04:03 crc kubenswrapper[4664]: I1003 08:04:03.268260 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-nlqsf" Oct 03 08:04:03 crc kubenswrapper[4664]: I1003 08:04:03.627185 4664 generic.go:334] "Generic (PLEG): container finished" podID="1604a32c-c92b-4bed-9402-1ea47abcc2ea" containerID="045086a2502794c662c84363c5a4eaeee9d110faa7b8ab6d2741950bed609151" exitCode=0 Oct 03 08:04:03 crc kubenswrapper[4664]: I1003 08:04:03.627229 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vzcf" event={"ID":"1604a32c-c92b-4bed-9402-1ea47abcc2ea","Type":"ContainerDied","Data":"045086a2502794c662c84363c5a4eaeee9d110faa7b8ab6d2741950bed609151"} Oct 03 08:04:04 crc kubenswrapper[4664]: I1003 08:04:04.638447 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vzcf" event={"ID":"1604a32c-c92b-4bed-9402-1ea47abcc2ea","Type":"ContainerStarted","Data":"626b14253caa250965fa845a062726004eed8fc47d65c626b1388109972512e5"} Oct 03 08:04:04 crc kubenswrapper[4664]: I1003 08:04:04.638824 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:04:04 crc kubenswrapper[4664]: I1003 08:04:04.638840 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vzcf" event={"ID":"1604a32c-c92b-4bed-9402-1ea47abcc2ea","Type":"ContainerStarted","Data":"1b96f11bbf3f141f556babb1cbd33dea380ce30f5fa4e07e593373505984da91"} Oct 03 08:04:04 crc kubenswrapper[4664]: I1003 08:04:04.638854 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vzcf" event={"ID":"1604a32c-c92b-4bed-9402-1ea47abcc2ea","Type":"ContainerStarted","Data":"3ef57cc77eb7d6d8c308335a798caf7f33972c85c8b591c45c159722f0b48682"} Oct 03 08:04:04 crc kubenswrapper[4664]: I1003 08:04:04.638865 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vzcf" event={"ID":"1604a32c-c92b-4bed-9402-1ea47abcc2ea","Type":"ContainerStarted","Data":"d00c772b0dfcdad3bf49363915060b4f0d4e14b7d28beba52b23e71053f76a1f"} Oct 03 08:04:04 crc kubenswrapper[4664]: I1003 08:04:04.638875 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vzcf" event={"ID":"1604a32c-c92b-4bed-9402-1ea47abcc2ea","Type":"ContainerStarted","Data":"32c4ea9e2631ddae277ceb606809474386c1ce33bf52008955ae3917da589e2e"} Oct 03 08:04:04 crc kubenswrapper[4664]: I1003 08:04:04.638886 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vzcf" 
event={"ID":"1604a32c-c92b-4bed-9402-1ea47abcc2ea","Type":"ContainerStarted","Data":"3c66b2dc903b5fa0a212e6238f4f67b3175f3bdeedae74fbee241ee6a716442e"} Oct 03 08:04:04 crc kubenswrapper[4664]: I1003 08:04:04.666052 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5vzcf" podStartSLOduration=5.110180305 podStartE2EDuration="12.666033934s" podCreationTimestamp="2025-10-03 08:03:52 +0000 UTC" firstStartedPulling="2025-10-03 08:03:53.836217819 +0000 UTC m=+934.657408309" lastFinishedPulling="2025-10-03 08:04:01.392071448 +0000 UTC m=+942.213261938" observedRunningTime="2025-10-03 08:04:04.665279842 +0000 UTC m=+945.486470352" watchObservedRunningTime="2025-10-03 08:04:04.666033934 +0000 UTC m=+945.487224434" Oct 03 08:04:08 crc kubenswrapper[4664]: I1003 08:04:08.720589 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:04:08 crc kubenswrapper[4664]: I1003 08:04:08.759406 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:04:13 crc kubenswrapper[4664]: I1003 08:04:13.726714 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5vzcf" Oct 03 08:04:13 crc kubenswrapper[4664]: I1003 08:04:13.736687 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-j98p4" Oct 03 08:04:14 crc kubenswrapper[4664]: I1003 08:04:14.750039 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-rq5kc" Oct 03 08:04:17 crc kubenswrapper[4664]: I1003 08:04:17.496909 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zg5rg"] Oct 03 08:04:17 crc kubenswrapper[4664]: I1003 08:04:17.498081 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zg5rg" Oct 03 08:04:17 crc kubenswrapper[4664]: I1003 08:04:17.502161 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-4t7s7" Oct 03 08:04:17 crc kubenswrapper[4664]: I1003 08:04:17.502772 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 03 08:04:17 crc kubenswrapper[4664]: I1003 08:04:17.503166 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 03 08:04:17 crc kubenswrapper[4664]: I1003 08:04:17.518695 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zg5rg"] Oct 03 08:04:17 crc kubenswrapper[4664]: I1003 08:04:17.562604 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbqcx\" (UniqueName: \"kubernetes.io/projected/4fef1902-6311-446b-b38b-4af3c3035b30-kube-api-access-jbqcx\") pod \"openstack-operator-index-zg5rg\" (UID: \"4fef1902-6311-446b-b38b-4af3c3035b30\") " pod="openstack-operators/openstack-operator-index-zg5rg" Oct 03 08:04:17 crc kubenswrapper[4664]: I1003 08:04:17.665366 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbqcx\" (UniqueName: \"kubernetes.io/projected/4fef1902-6311-446b-b38b-4af3c3035b30-kube-api-access-jbqcx\") pod \"openstack-operator-index-zg5rg\" (UID: \"4fef1902-6311-446b-b38b-4af3c3035b30\") " pod="openstack-operators/openstack-operator-index-zg5rg" Oct 03 08:04:17 crc kubenswrapper[4664]: I1003 08:04:17.692675 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbqcx\" (UniqueName: \"kubernetes.io/projected/4fef1902-6311-446b-b38b-4af3c3035b30-kube-api-access-jbqcx\") pod \"openstack-operator-index-zg5rg\" (UID: \"4fef1902-6311-446b-b38b-4af3c3035b30\") " pod="openstack-operators/openstack-operator-index-zg5rg" Oct 03 08:04:17 crc kubenswrapper[4664]: I1003 08:04:17.824020 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zg5rg" Oct 03 08:04:18 crc kubenswrapper[4664]: I1003 08:04:18.156173 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zg5rg"] Oct 03 08:04:18 crc kubenswrapper[4664]: I1003 08:04:18.730420 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zg5rg" event={"ID":"4fef1902-6311-446b-b38b-4af3c3035b30","Type":"ContainerStarted","Data":"15b545182dfaafbf5995a47a42bbab0b6fd69a78343bb09e481ef4eaf91c3355"} Oct 03 08:04:20 crc kubenswrapper[4664]: I1003 08:04:20.742860 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zg5rg" event={"ID":"4fef1902-6311-446b-b38b-4af3c3035b30","Type":"ContainerStarted","Data":"f85ccd7208ba1bc19e5d17fb6952714c4ce2ee34949506245e0beab208366a6e"} Oct 03 08:04:20 crc kubenswrapper[4664]: I1003 08:04:20.761300 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zg5rg" podStartSLOduration=1.746096987 podStartE2EDuration="3.761263104s" podCreationTimestamp="2025-10-03 08:04:17 +0000 UTC" firstStartedPulling="2025-10-03 08:04:18.162796209 +0000 UTC m=+958.983986699" lastFinishedPulling="2025-10-03 08:04:20.177962326 +0000 UTC m=+960.999152816" observedRunningTime="2025-10-03 08:04:20.758410063 +0000 UTC m=+961.579600553" watchObservedRunningTime="2025-10-03 08:04:20.761263104 +0000 UTC m=+961.582453614" Oct 03 08:04:27 crc kubenswrapper[4664]: I1003 08:04:27.824065 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-zg5rg" Oct 03 08:04:27 crc kubenswrapper[4664]: I1003 08:04:27.824643 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-zg5rg" Oct 03 08:04:27 crc kubenswrapper[4664]: I1003 08:04:27.863207 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-zg5rg" Oct 03 08:04:28 crc kubenswrapper[4664]: I1003 08:04:28.809791 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-zg5rg" Oct 03 08:04:34 crc kubenswrapper[4664]: I1003 08:04:34.118518 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72"] Oct 03 08:04:34 crc kubenswrapper[4664]: I1003 08:04:34.120711 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" Oct 03 08:04:34 crc kubenswrapper[4664]: I1003 08:04:34.122739 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-f6fwk" Oct 03 08:04:34 crc kubenswrapper[4664]: I1003 08:04:34.127998 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72"] Oct 03 08:04:34 crc kubenswrapper[4664]: I1003 08:04:34.202213 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1139aaa-91dd-410c-b4f8-695cad546424-util\") pod \"a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72\" (UID: \"a1139aaa-91dd-410c-b4f8-695cad546424\") " pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" Oct 03 08:04:34 crc kubenswrapper[4664]: I1003 08:04:34.202272 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1139aaa-91dd-410c-b4f8-695cad546424-bundle\") pod \"a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72\" (UID: \"a1139aaa-91dd-410c-b4f8-695cad546424\") " pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" Oct 03 08:04:34 crc kubenswrapper[4664]: I1003 08:04:34.202333 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gntlx\" (UniqueName: \"kubernetes.io/projected/a1139aaa-91dd-410c-b4f8-695cad546424-kube-api-access-gntlx\") pod \"a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72\" (UID: \"a1139aaa-91dd-410c-b4f8-695cad546424\") " pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" Oct 03 08:04:34 crc kubenswrapper[4664]: I1003 08:04:34.303643 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gntlx\" (UniqueName: \"kubernetes.io/projected/a1139aaa-91dd-410c-b4f8-695cad546424-kube-api-access-gntlx\") pod \"a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72\" (UID: \"a1139aaa-91dd-410c-b4f8-695cad546424\") " pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" Oct 03 08:04:34 crc kubenswrapper[4664]: I1003 08:04:34.303977 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1139aaa-91dd-410c-b4f8-695cad546424-util\") pod \"a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72\" (UID: \"a1139aaa-91dd-410c-b4f8-695cad546424\") " pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" Oct 03 08:04:34 crc kubenswrapper[4664]: I1003 08:04:34.304072 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1139aaa-91dd-410c-b4f8-695cad546424-bundle\") pod \"a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72\" (UID: \"a1139aaa-91dd-410c-b4f8-695cad546424\") " pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" Oct 03 08:04:34 crc kubenswrapper[4664]: I1003 08:04:34.304551 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a1139aaa-91dd-410c-b4f8-695cad546424-util\") pod \"a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72\" (UID: \"a1139aaa-91dd-410c-b4f8-695cad546424\") " pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" Oct 03 08:04:34 crc kubenswrapper[4664]: I1003 08:04:34.304599 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1139aaa-91dd-410c-b4f8-695cad546424-bundle\") pod \"a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72\" (UID: \"a1139aaa-91dd-410c-b4f8-695cad546424\") " pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" Oct 03 08:04:34 crc kubenswrapper[4664]: I1003 08:04:34.325365 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gntlx\" (UniqueName: \"kubernetes.io/projected/a1139aaa-91dd-410c-b4f8-695cad546424-kube-api-access-gntlx\") pod \"a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72\" (UID: \"a1139aaa-91dd-410c-b4f8-695cad546424\") " pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" Oct 03 08:04:34 crc kubenswrapper[4664]: I1003 08:04:34.440700 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" Oct 03 08:04:34 crc kubenswrapper[4664]: I1003 08:04:34.873021 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72"] Oct 03 08:04:34 crc kubenswrapper[4664]: W1003 08:04:34.877113 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1139aaa_91dd_410c_b4f8_695cad546424.slice/crio-ffac6d29316120e0f04b3386bc23fb82033547cda2c4af21d385a2bc384aed23 WatchSource:0}: Error finding container ffac6d29316120e0f04b3386bc23fb82033547cda2c4af21d385a2bc384aed23: Status 404 returned error can't find the container with id ffac6d29316120e0f04b3386bc23fb82033547cda2c4af21d385a2bc384aed23 Oct 03 08:04:35 crc kubenswrapper[4664]: I1003 08:04:35.832411 4664 generic.go:334] "Generic (PLEG): container finished" podID="a1139aaa-91dd-410c-b4f8-695cad546424" containerID="03e23eac1d83c5837d12f4728d521563b36ae00aad17d3e88dd71d757de095b9" exitCode=0 Oct 03 08:04:35 crc kubenswrapper[4664]: I1003 08:04:35.832594 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" event={"ID":"a1139aaa-91dd-410c-b4f8-695cad546424","Type":"ContainerDied","Data":"03e23eac1d83c5837d12f4728d521563b36ae00aad17d3e88dd71d757de095b9"} Oct 03 08:04:35 crc kubenswrapper[4664]: I1003 08:04:35.832979 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" event={"ID":"a1139aaa-91dd-410c-b4f8-695cad546424","Type":"ContainerStarted","Data":"ffac6d29316120e0f04b3386bc23fb82033547cda2c4af21d385a2bc384aed23"} Oct 03 08:04:36 crc kubenswrapper[4664]: I1003 08:04:36.840211 4664 generic.go:334] "Generic (PLEG): container finished" podID="a1139aaa-91dd-410c-b4f8-695cad546424" containerID="d3f0114126b5e25ffd3dfbbe3e1e3fc7d8cd361da51a5c6ca6817fa38f2ddbe1" exitCode=0 Oct 03 08:04:36 crc kubenswrapper[4664]: I1003 08:04:36.840318 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" event={"ID":"a1139aaa-91dd-410c-b4f8-695cad546424","Type":"ContainerDied","Data":"d3f0114126b5e25ffd3dfbbe3e1e3fc7d8cd361da51a5c6ca6817fa38f2ddbe1"} Oct 03 08:04:37 crc kubenswrapper[4664]: I1003 08:04:37.849641 4664 generic.go:334] "Generic (PLEG): container finished" podID="a1139aaa-91dd-410c-b4f8-695cad546424" containerID="b82faa8ea3243a894b8cb3b7fe26ef0df1018edf4c269b03291482974b2cc916" exitCode=0 Oct 03 08:04:37 crc kubenswrapper[4664]: I1003 08:04:37.849684 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" event={"ID":"a1139aaa-91dd-410c-b4f8-695cad546424","Type":"ContainerDied","Data":"b82faa8ea3243a894b8cb3b7fe26ef0df1018edf4c269b03291482974b2cc916"} Oct 03 08:04:39 crc kubenswrapper[4664]: I1003 08:04:39.087706 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" Oct 03 08:04:39 crc kubenswrapper[4664]: I1003 08:04:39.173024 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gntlx\" (UniqueName: \"kubernetes.io/projected/a1139aaa-91dd-410c-b4f8-695cad546424-kube-api-access-gntlx\") pod \"a1139aaa-91dd-410c-b4f8-695cad546424\" (UID: \"a1139aaa-91dd-410c-b4f8-695cad546424\") " Oct 03 08:04:39 crc kubenswrapper[4664]: I1003 08:04:39.173105 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1139aaa-91dd-410c-b4f8-695cad546424-util\") pod \"a1139aaa-91dd-410c-b4f8-695cad546424\" (UID: \"a1139aaa-91dd-410c-b4f8-695cad546424\") " Oct 03 08:04:39 crc kubenswrapper[4664]: I1003 08:04:39.173173 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1139aaa-91dd-410c-b4f8-695cad546424-bundle\") pod \"a1139aaa-91dd-410c-b4f8-695cad546424\" (UID: \"a1139aaa-91dd-410c-b4f8-695cad546424\") " Oct 03 08:04:39 crc kubenswrapper[4664]: I1003 08:04:39.174095 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1139aaa-91dd-410c-b4f8-695cad546424-bundle" (OuterVolumeSpecName: "bundle") pod "a1139aaa-91dd-410c-b4f8-695cad546424" (UID: "a1139aaa-91dd-410c-b4f8-695cad546424"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:04:39 crc kubenswrapper[4664]: I1003 08:04:39.177973 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1139aaa-91dd-410c-b4f8-695cad546424-kube-api-access-gntlx" (OuterVolumeSpecName: "kube-api-access-gntlx") pod "a1139aaa-91dd-410c-b4f8-695cad546424" (UID: "a1139aaa-91dd-410c-b4f8-695cad546424"). InnerVolumeSpecName "kube-api-access-gntlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:04:39 crc kubenswrapper[4664]: I1003 08:04:39.187559 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1139aaa-91dd-410c-b4f8-695cad546424-util" (OuterVolumeSpecName: "util") pod "a1139aaa-91dd-410c-b4f8-695cad546424" (UID: "a1139aaa-91dd-410c-b4f8-695cad546424"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:04:39 crc kubenswrapper[4664]: I1003 08:04:39.275122 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gntlx\" (UniqueName: \"kubernetes.io/projected/a1139aaa-91dd-410c-b4f8-695cad546424-kube-api-access-gntlx\") on node \"crc\" DevicePath \"\"" Oct 03 08:04:39 crc kubenswrapper[4664]: I1003 08:04:39.275170 4664 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1139aaa-91dd-410c-b4f8-695cad546424-util\") on node \"crc\" DevicePath \"\"" Oct 03 08:04:39 crc kubenswrapper[4664]: I1003 08:04:39.275182 4664 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1139aaa-91dd-410c-b4f8-695cad546424-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:04:39 crc kubenswrapper[4664]: I1003 08:04:39.863309 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" event={"ID":"a1139aaa-91dd-410c-b4f8-695cad546424","Type":"ContainerDied","Data":"ffac6d29316120e0f04b3386bc23fb82033547cda2c4af21d385a2bc384aed23"} Oct 03 08:04:39 crc kubenswrapper[4664]: I1003 08:04:39.863759 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffac6d29316120e0f04b3386bc23fb82033547cda2c4af21d385a2bc384aed23" Oct 03 08:04:39 crc kubenswrapper[4664]: I1003 08:04:39.863419 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72" Oct 03 08:04:47 crc kubenswrapper[4664]: I1003 08:04:47.013066 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5597f8fd94-dj29q"] Oct 03 08:04:47 crc kubenswrapper[4664]: E1003 08:04:47.013844 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1139aaa-91dd-410c-b4f8-695cad546424" containerName="util" Oct 03 08:04:47 crc kubenswrapper[4664]: I1003 08:04:47.013858 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1139aaa-91dd-410c-b4f8-695cad546424" containerName="util" Oct 03 08:04:47 crc kubenswrapper[4664]: E1003 08:04:47.013884 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1139aaa-91dd-410c-b4f8-695cad546424" containerName="pull" Oct 03 08:04:47 crc kubenswrapper[4664]: I1003 08:04:47.013891 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1139aaa-91dd-410c-b4f8-695cad546424" containerName="pull" Oct 03 08:04:47 crc kubenswrapper[4664]: E1003 08:04:47.013900 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1139aaa-91dd-410c-b4f8-695cad546424" containerName="extract" Oct 03 08:04:47 crc kubenswrapper[4664]: I1003 08:04:47.013906 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1139aaa-91dd-410c-b4f8-695cad546424" containerName="extract" Oct 03 08:04:47 crc kubenswrapper[4664]: I1003 08:04:47.014017 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1139aaa-91dd-410c-b4f8-695cad546424" containerName="extract" Oct 03 08:04:47 crc kubenswrapper[4664]: I1003 08:04:47.014725 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5597f8fd94-dj29q" Oct 03 08:04:47 crc kubenswrapper[4664]: I1003 08:04:47.018360 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-nvbv8" Oct 03 08:04:47 crc kubenswrapper[4664]: I1003 08:04:47.044441 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5597f8fd94-dj29q"] Oct 03 08:04:47 crc kubenswrapper[4664]: I1003 08:04:47.080289 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drcnb\" (UniqueName: \"kubernetes.io/projected/4d57e4e0-7496-4ae6-8871-97016022a533-kube-api-access-drcnb\") pod \"openstack-operator-controller-operator-5597f8fd94-dj29q\" (UID: \"4d57e4e0-7496-4ae6-8871-97016022a533\") " pod="openstack-operators/openstack-operator-controller-operator-5597f8fd94-dj29q" Oct 03 08:04:47 crc kubenswrapper[4664]: I1003 08:04:47.181892 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drcnb\" (UniqueName: \"kubernetes.io/projected/4d57e4e0-7496-4ae6-8871-97016022a533-kube-api-access-drcnb\") pod \"openstack-operator-controller-operator-5597f8fd94-dj29q\" (UID: \"4d57e4e0-7496-4ae6-8871-97016022a533\") " pod="openstack-operators/openstack-operator-controller-operator-5597f8fd94-dj29q" Oct 03 08:04:47 crc kubenswrapper[4664]: I1003 08:04:47.202565 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drcnb\" (UniqueName: \"kubernetes.io/projected/4d57e4e0-7496-4ae6-8871-97016022a533-kube-api-access-drcnb\") pod \"openstack-operator-controller-operator-5597f8fd94-dj29q\" (UID: \"4d57e4e0-7496-4ae6-8871-97016022a533\") " pod="openstack-operators/openstack-operator-controller-operator-5597f8fd94-dj29q" Oct 03 08:04:47 crc kubenswrapper[4664]: I1003 08:04:47.333019 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5597f8fd94-dj29q" Oct 03 08:04:47 crc kubenswrapper[4664]: I1003 08:04:47.608471 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5597f8fd94-dj29q"] Oct 03 08:04:47 crc kubenswrapper[4664]: I1003 08:04:47.927692 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5597f8fd94-dj29q" event={"ID":"4d57e4e0-7496-4ae6-8871-97016022a533","Type":"ContainerStarted","Data":"e88bb963c85f0bcec2d8394654a1502ebf699c5a10ab6f8c769ea7b5a8d5221f"} Oct 03 08:04:53 crc kubenswrapper[4664]: I1003 08:04:53.006197 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5597f8fd94-dj29q" event={"ID":"4d57e4e0-7496-4ae6-8871-97016022a533","Type":"ContainerStarted","Data":"d62ab38564a0de1ed4bf35f321ca2b5c9f78ccb74fe367377ac0e4d10181b51b"} Oct 03 08:04:56 crc kubenswrapper[4664]: I1003 08:04:56.024118 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5597f8fd94-dj29q" event={"ID":"4d57e4e0-7496-4ae6-8871-97016022a533","Type":"ContainerStarted","Data":"73ffc19393e918f75221bea7656863f9eabb20fff55ba447d325703a414d3ed9"} Oct 03 08:04:56 crc kubenswrapper[4664]: I1003 08:04:56.024550 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5597f8fd94-dj29q" Oct 03 08:04:56 crc kubenswrapper[4664]: I1003 08:04:56.060984 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5597f8fd94-dj29q" podStartSLOduration=2.719152962 podStartE2EDuration="10.060938941s" podCreationTimestamp="2025-10-03 08:04:46 +0000 UTC" firstStartedPulling="2025-10-03 08:04:47.615123867 +0000 UTC m=+988.436314357" lastFinishedPulling="2025-10-03 08:04:54.956909846 +0000 UTC m=+995.778100336" observedRunningTime="2025-10-03 08:04:56.058000935 +0000 UTC m=+996.879191445" watchObservedRunningTime="2025-10-03 08:04:56.060938941 +0000 UTC m=+996.882129431" Oct 03 08:04:57 crc kubenswrapper[4664]: I1003 08:04:57.032595 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5597f8fd94-dj29q" Oct 03 08:05:11 crc kubenswrapper[4664]: I1003 08:05:11.987096 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:05:11 crc kubenswrapper[4664]: I1003 08:05:11.987738 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.326060 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-mhzlb"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.327381 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-mhzlb" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.336672 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-nw8d7"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.338005 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-nw8d7" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.340901 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5khvm" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.341002 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-l4mwm" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.343135 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-mhzlb"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.355006 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-bvgv4"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.356370 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-bvgv4" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.359716 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-hdsxw" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.373650 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-nw8d7"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.376783 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjbqx\" (UniqueName: \"kubernetes.io/projected/16a5e165-d1b6-4023-b48e-f4b918730203-kube-api-access-cjbqx\") pod \"barbican-operator-controller-manager-6c675fb79f-nw8d7\" (UID: \"16a5e165-d1b6-4023-b48e-f4b918730203\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-nw8d7" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.376834 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9txgb\" (UniqueName: \"kubernetes.io/projected/f1eed736-5b4e-4e82-915d-288d59a82b94-kube-api-access-9txgb\") pod \"designate-operator-controller-manager-75dfd9b554-bvgv4\" (UID: \"f1eed736-5b4e-4e82-915d-288d59a82b94\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-bvgv4" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.376886 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzc9x\" (UniqueName: \"kubernetes.io/projected/30114f47-bc69-49b2-b3ab-d201c2c146ee-kube-api-access-xzc9x\") pod \"cinder-operator-controller-manager-79d68d6c85-mhzlb\" (UID: \"30114f47-bc69-49b2-b3ab-d201c2c146ee\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-mhzlb" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.381896 4664 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-44trb"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.383278 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-44trb" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.388045 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-6d2wr" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.395029 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-bvgv4"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.407588 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-g5h6q"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.409114 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-599898f689-g5h6q" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.413023 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-fxwnv" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.425216 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-44trb"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.439541 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-qlgfl"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.440851 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-qlgfl" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.446007 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-x8ff8" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.446737 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-g5h6q"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.483083 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x8f2\" (UniqueName: \"kubernetes.io/projected/28f90c47-2add-426b-953e-be4842b71cfc-kube-api-access-8x8f2\") pod \"horizon-operator-controller-manager-6769b867d9-qlgfl\" (UID: \"28f90c47-2add-426b-953e-be4842b71cfc\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-qlgfl" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.483164 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhbtl\" (UniqueName: \"kubernetes.io/projected/d80c9513-6585-415d-a747-2ff4bcca33d7-kube-api-access-xhbtl\") pod \"glance-operator-controller-manager-846dff85b5-44trb\" (UID: \"d80c9513-6585-415d-a747-2ff4bcca33d7\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-44trb" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.483191 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xc4q\" (UniqueName: \"kubernetes.io/projected/56f9a3d8-3c1e-4d8d-a005-3c5f42beb67e-kube-api-access-4xc4q\") pod \"heat-operator-controller-manager-599898f689-g5h6q\" (UID: \"56f9a3d8-3c1e-4d8d-a005-3c5f42beb67e\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-g5h6q" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.483264 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjbqx\" (UniqueName: \"kubernetes.io/projected/16a5e165-d1b6-4023-b48e-f4b918730203-kube-api-access-cjbqx\") pod \"barbican-operator-controller-manager-6c675fb79f-nw8d7\" (UID: \"16a5e165-d1b6-4023-b48e-f4b918730203\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-nw8d7" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.483290 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9txgb\" (UniqueName: \"kubernetes.io/projected/f1eed736-5b4e-4e82-915d-288d59a82b94-kube-api-access-9txgb\") pod \"designate-operator-controller-manager-75dfd9b554-bvgv4\" (UID: \"f1eed736-5b4e-4e82-915d-288d59a82b94\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-bvgv4" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.483331 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzc9x\" (UniqueName: \"kubernetes.io/projected/30114f47-bc69-49b2-b3ab-d201c2c146ee-kube-api-access-xzc9x\") pod \"cinder-operator-controller-manager-79d68d6c85-mhzlb\" (UID: \"30114f47-bc69-49b2-b3ab-d201c2c146ee\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-mhzlb" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.505227 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-qlgfl"] Oct 03 08:05:14 crc 
kubenswrapper[4664]: I1003 08:05:14.542026 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-fhpcz"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.545978 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzc9x\" (UniqueName: \"kubernetes.io/projected/30114f47-bc69-49b2-b3ab-d201c2c146ee-kube-api-access-xzc9x\") pod \"cinder-operator-controller-manager-79d68d6c85-mhzlb\" (UID: \"30114f47-bc69-49b2-b3ab-d201c2c146ee\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-mhzlb" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.551911 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjbqx\" (UniqueName: \"kubernetes.io/projected/16a5e165-d1b6-4023-b48e-f4b918730203-kube-api-access-cjbqx\") pod \"barbican-operator-controller-manager-6c675fb79f-nw8d7\" (UID: \"16a5e165-d1b6-4023-b48e-f4b918730203\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-nw8d7" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.559527 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.559847 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-fhpcz" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.562962 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.566557 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9txgb\" (UniqueName: \"kubernetes.io/projected/f1eed736-5b4e-4e82-915d-288d59a82b94-kube-api-access-9txgb\") pod \"designate-operator-controller-manager-75dfd9b554-bvgv4\" (UID: \"f1eed736-5b4e-4e82-915d-288d59a82b94\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-bvgv4" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.566692 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-fhpcz"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.580967 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.585279 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-nwb8h" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.585472 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.585575 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-ggkcl" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.586278 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-n8zxl\" (UID: \"47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc\") " 
pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.586322 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x8f2\" (UniqueName: \"kubernetes.io/projected/28f90c47-2add-426b-953e-be4842b71cfc-kube-api-access-8x8f2\") pod \"horizon-operator-controller-manager-6769b867d9-qlgfl\" (UID: \"28f90c47-2add-426b-953e-be4842b71cfc\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-qlgfl" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.586354 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhbtl\" (UniqueName: \"kubernetes.io/projected/d80c9513-6585-415d-a747-2ff4bcca33d7-kube-api-access-xhbtl\") pod \"glance-operator-controller-manager-846dff85b5-44trb\" (UID: \"d80c9513-6585-415d-a747-2ff4bcca33d7\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-44trb" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.586369 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xc4q\" (UniqueName: \"kubernetes.io/projected/56f9a3d8-3c1e-4d8d-a005-3c5f42beb67e-kube-api-access-4xc4q\") pod \"heat-operator-controller-manager-599898f689-g5h6q\" (UID: \"56f9a3d8-3c1e-4d8d-a005-3c5f42beb67e\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-g5h6q" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.586389 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77m69\" (UniqueName: \"kubernetes.io/projected/47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc-kube-api-access-77m69\") pod \"infra-operator-controller-manager-5fbf469cd7-n8zxl\" (UID: \"47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.586448 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r62wk\" (UniqueName: \"kubernetes.io/projected/45229d75-9e65-47bb-a54a-b27742fe2717-kube-api-access-r62wk\") pod \"ironic-operator-controller-manager-84bc9db6cc-fhpcz\" (UID: \"45229d75-9e65-47bb-a54a-b27742fe2717\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-fhpcz" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.600849 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-qf7qg"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.602322 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-qf7qg" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.617106 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-7sskf" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.634112 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-qf7qg"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.655708 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.656116 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-mhzlb" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.657067 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.666263 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-8b7n2" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.673250 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhbtl\" (UniqueName: \"kubernetes.io/projected/d80c9513-6585-415d-a747-2ff4bcca33d7-kube-api-access-xhbtl\") pod \"glance-operator-controller-manager-846dff85b5-44trb\" (UID: \"d80c9513-6585-415d-a747-2ff4bcca33d7\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-44trb" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.681133 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-nw8d7" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.694065 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-n8zxl\" (UID: \"47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.694155 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt47k\" (UniqueName: \"kubernetes.io/projected/738a79fc-a362-4c7c-a101-f8551019a96b-kube-api-access-nt47k\") pod \"manila-operator-controller-manager-6fd6854b49-8nvt9\" (UID: \"738a79fc-a362-4c7c-a101-f8551019a96b\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.694232 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4qtq\" (UniqueName: \"kubernetes.io/projected/2ed5f442-2c30-4c29-a65b-4b3d9262cbce-kube-api-access-j4qtq\") pod \"keystone-operator-controller-manager-7f55849f88-qf7qg\" (UID: \"2ed5f442-2c30-4c29-a65b-4b3d9262cbce\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-qf7qg" Oct 03 08:05:14 crc kubenswrapper[4664]: E1003 08:05:14.694837 4664 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 03 08:05:14 crc kubenswrapper[4664]: E1003 08:05:14.694914 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc-cert podName:47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc nodeName:}" failed. No retries permitted until 2025-10-03 08:05:15.194894322 +0000 UTC m=+1016.016084812 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc-cert") pod "infra-operator-controller-manager-5fbf469cd7-n8zxl" (UID: "47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc") : secret "infra-operator-webhook-server-cert" not found Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.695338 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x8f2\" (UniqueName: \"kubernetes.io/projected/28f90c47-2add-426b-953e-be4842b71cfc-kube-api-access-8x8f2\") pod \"horizon-operator-controller-manager-6769b867d9-qlgfl\" (UID: \"28f90c47-2add-426b-953e-be4842b71cfc\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-qlgfl" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.695401 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77m69\" (UniqueName: \"kubernetes.io/projected/47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc-kube-api-access-77m69\") pod \"infra-operator-controller-manager-5fbf469cd7-n8zxl\" (UID: \"47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.702915 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r62wk\" (UniqueName: \"kubernetes.io/projected/45229d75-9e65-47bb-a54a-b27742fe2717-kube-api-access-r62wk\") pod \"ironic-operator-controller-manager-84bc9db6cc-fhpcz\" (UID: \"45229d75-9e65-47bb-a54a-b27742fe2717\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-fhpcz" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.703567 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-bvgv4" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.718312 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.720952 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-44trb" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.728105 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xc4q\" (UniqueName: \"kubernetes.io/projected/56f9a3d8-3c1e-4d8d-a005-3c5f42beb67e-kube-api-access-4xc4q\") pod \"heat-operator-controller-manager-599898f689-g5h6q\" (UID: \"56f9a3d8-3c1e-4d8d-a005-3c5f42beb67e\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-g5h6q" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.735351 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-tz9q5"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.738576 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-tz9q5" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.742973 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-599898f689-g5h6q" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.750819 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77m69\" (UniqueName: \"kubernetes.io/projected/47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc-kube-api-access-77m69\") pod \"infra-operator-controller-manager-5fbf469cd7-n8zxl\" (UID: \"47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.765191 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-92wk7" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.774711 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-tz9q5"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.783343 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r62wk\" (UniqueName: \"kubernetes.io/projected/45229d75-9e65-47bb-a54a-b27742fe2717-kube-api-access-r62wk\") pod \"ironic-operator-controller-manager-84bc9db6cc-fhpcz\" (UID: \"45229d75-9e65-47bb-a54a-b27742fe2717\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-fhpcz" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.789707 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-59qtz"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.790826 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-59qtz" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.794104 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-h6lzt" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.805906 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdkxw\" (UniqueName: \"kubernetes.io/projected/d5285a08-51d1-4f27-b87e-73fc4c0dc037-kube-api-access-qdkxw\") pod \"mariadb-operator-controller-manager-5c468bf4d4-tz9q5\" (UID: \"d5285a08-51d1-4f27-b87e-73fc4c0dc037\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-tz9q5" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.806376 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt47k\" (UniqueName: \"kubernetes.io/projected/738a79fc-a362-4c7c-a101-f8551019a96b-kube-api-access-nt47k\") pod \"manila-operator-controller-manager-6fd6854b49-8nvt9\" (UID: \"738a79fc-a362-4c7c-a101-f8551019a96b\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.806423 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4qtq\" (UniqueName: \"kubernetes.io/projected/2ed5f442-2c30-4c29-a65b-4b3d9262cbce-kube-api-access-j4qtq\") pod \"keystone-operator-controller-manager-7f55849f88-qf7qg\" (UID: \"2ed5f442-2c30-4c29-a65b-4b3d9262cbce\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-qf7qg" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.814170 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-qlgfl" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.835715 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.837348 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.855988 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-c5dzn" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.856322 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-bf92q"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.858014 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-bf92q" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.866581 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4qtq\" (UniqueName: \"kubernetes.io/projected/2ed5f442-2c30-4c29-a65b-4b3d9262cbce-kube-api-access-j4qtq\") pod \"keystone-operator-controller-manager-7f55849f88-qf7qg\" (UID: \"2ed5f442-2c30-4c29-a65b-4b3d9262cbce\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-qf7qg" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.869717 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-59qtz"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.870100 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-8bmm6" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.907793 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7jpn\" (UniqueName: \"kubernetes.io/projected/5bedd109-c928-4b87-8593-21f72b0a6165-kube-api-access-c7jpn\") pod \"nova-operator-controller-manager-555c7456bd-59qtz\" (UID: \"5bedd109-c928-4b87-8593-21f72b0a6165\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-59qtz" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.907977 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52mc4\" (UniqueName: \"kubernetes.io/projected/a931be65-b5d0-4685-82ea-42102c8235c8-kube-api-access-52mc4\") pod \"octavia-operator-controller-manager-59d6cfdf45-8jdvq\" (UID: \"a931be65-b5d0-4685-82ea-42102c8235c8\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.908044 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6bs5\" (UniqueName: \"kubernetes.io/projected/7d9f54b0-ed53-4042-98d1-eb5ceb6f629b-kube-api-access-t6bs5\") pod \"neutron-operator-controller-manager-6574bf987d-bf92q\" (UID: \"7d9f54b0-ed53-4042-98d1-eb5ceb6f629b\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-bf92q" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.908097 4664 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qdkxw\" (UniqueName: \"kubernetes.io/projected/d5285a08-51d1-4f27-b87e-73fc4c0dc037-kube-api-access-qdkxw\") pod \"mariadb-operator-controller-manager-5c468bf4d4-tz9q5\" (UID: \"d5285a08-51d1-4f27-b87e-73fc4c0dc037\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-tz9q5" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.907819 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt47k\" (UniqueName: \"kubernetes.io/projected/738a79fc-a362-4c7c-a101-f8551019a96b-kube-api-access-nt47k\") pod \"manila-operator-controller-manager-6fd6854b49-8nvt9\" (UID: \"738a79fc-a362-4c7c-a101-f8551019a96b\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.914088 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-fhpcz" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.917742 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.951588 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-qf7qg" Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.978178 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-bf92q"] Oct 03 08:05:14 crc kubenswrapper[4664]: I1003 08:05:14.991983 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdkxw\" (UniqueName: \"kubernetes.io/projected/d5285a08-51d1-4f27-b87e-73fc4c0dc037-kube-api-access-qdkxw\") pod \"mariadb-operator-controller-manager-5c468bf4d4-tz9q5\" (UID: \"d5285a08-51d1-4f27-b87e-73fc4c0dc037\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-tz9q5" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.008381 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-fg9pn"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.013217 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52mc4\" (UniqueName: \"kubernetes.io/projected/a931be65-b5d0-4685-82ea-42102c8235c8-kube-api-access-52mc4\") pod \"octavia-operator-controller-manager-59d6cfdf45-8jdvq\" (UID: \"a931be65-b5d0-4685-82ea-42102c8235c8\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.013307 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6bs5\" (UniqueName: \"kubernetes.io/projected/7d9f54b0-ed53-4042-98d1-eb5ceb6f629b-kube-api-access-t6bs5\") pod \"neutron-operator-controller-manager-6574bf987d-bf92q\" (UID: \"7d9f54b0-ed53-4042-98d1-eb5ceb6f629b\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-bf92q" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.013402 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7jpn\" (UniqueName: \"kubernetes.io/projected/5bedd109-c928-4b87-8593-21f72b0a6165-kube-api-access-c7jpn\") pod \"nova-operator-controller-manager-555c7456bd-59qtz\" (UID: 
\"5bedd109-c928-4b87-8593-21f72b0a6165\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-59qtz" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.023336 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-fg9pn" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.037926 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.039858 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.045994 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-fv9jj" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.056004 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.057514 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.064579 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-7fbxf" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.073307 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-kxp5m" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.081207 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.099701 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6bs5\" (UniqueName: \"kubernetes.io/projected/7d9f54b0-ed53-4042-98d1-eb5ceb6f629b-kube-api-access-t6bs5\") pod \"neutron-operator-controller-manager-6574bf987d-bf92q\" (UID: \"7d9f54b0-ed53-4042-98d1-eb5ceb6f629b\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-bf92q" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.113072 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7jpn\" (UniqueName: \"kubernetes.io/projected/5bedd109-c928-4b87-8593-21f72b0a6165-kube-api-access-c7jpn\") pod \"nova-operator-controller-manager-555c7456bd-59qtz\" (UID: \"5bedd109-c928-4b87-8593-21f72b0a6165\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-59qtz" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.114734 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vhn6\" (UniqueName: \"kubernetes.io/projected/e8cd4885-34c3-4a78-b23d-9e57aa0517ca-kube-api-access-4vhn6\") pod \"ovn-operator-controller-manager-688db7b6c7-ct22p\" (UID: \"e8cd4885-34c3-4a78-b23d-9e57aa0517ca\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.114848 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kw99n\" (UniqueName: \"kubernetes.io/projected/bca141be-db7d-4e1e-b95f-12f9b63522b7-kube-api-access-kw99n\") pod \"placement-operator-controller-manager-7d8bb7f44c-fg9pn\" (UID: \"bca141be-db7d-4e1e-b95f-12f9b63522b7\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-fg9pn" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.114872 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2784f55b-9b0e-49e0-9a5b-df56008a2be9-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h\" (UID: \"2784f55b-9b0e-49e0-9a5b-df56008a2be9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.114899 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snlwk\" (UniqueName: \"kubernetes.io/projected/2784f55b-9b0e-49e0-9a5b-df56008a2be9-kube-api-access-snlwk\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h\" (UID: \"2784f55b-9b0e-49e0-9a5b-df56008a2be9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.117268 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52mc4\" (UniqueName: \"kubernetes.io/projected/a931be65-b5d0-4685-82ea-42102c8235c8-kube-api-access-52mc4\") pod \"octavia-operator-controller-manager-59d6cfdf45-8jdvq\" (UID: \"a931be65-b5d0-4685-82ea-42102c8235c8\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.137378 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.137387 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-fg9pn"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.164400 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.193941 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-tz9q5" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.200270 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.202084 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.206240 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.240257 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.244536 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-s9jrk" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.268160 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-59qtz" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.278878 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.282411 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.287297 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-9dshv" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.240607 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw99n\" (UniqueName: \"kubernetes.io/projected/bca141be-db7d-4e1e-b95f-12f9b63522b7-kube-api-access-kw99n\") pod \"placement-operator-controller-manager-7d8bb7f44c-fg9pn\" (UID: \"bca141be-db7d-4e1e-b95f-12f9b63522b7\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-fg9pn" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.290870 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2784f55b-9b0e-49e0-9a5b-df56008a2be9-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h\" (UID: \"2784f55b-9b0e-49e0-9a5b-df56008a2be9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.290963 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snlwk\" (UniqueName: \"kubernetes.io/projected/2784f55b-9b0e-49e0-9a5b-df56008a2be9-kube-api-access-snlwk\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h\" (UID: \"2784f55b-9b0e-49e0-9a5b-df56008a2be9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.291157 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vhn6\" (UniqueName: \"kubernetes.io/projected/e8cd4885-34c3-4a78-b23d-9e57aa0517ca-kube-api-access-4vhn6\") pod \"ovn-operator-controller-manager-688db7b6c7-ct22p\" (UID: \"e8cd4885-34c3-4a78-b23d-9e57aa0517ca\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.291422 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-n8zxl\" (UID: \"47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl" Oct 03 08:05:15 crc kubenswrapper[4664]: E1003 08:05:15.292911 4664 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 08:05:15 crc kubenswrapper[4664]: E1003 08:05:15.293016 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2784f55b-9b0e-49e0-9a5b-df56008a2be9-cert podName:2784f55b-9b0e-49e0-9a5b-df56008a2be9 nodeName:}" failed. No retries permitted until 2025-10-03 08:05:15.792988459 +0000 UTC m=+1016.614178949 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2784f55b-9b0e-49e0-9a5b-df56008a2be9-cert") pod "openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" (UID: "2784f55b-9b0e-49e0-9a5b-df56008a2be9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.312601 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-bf92q" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.321314 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-n8zxl\" (UID: \"47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.366412 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vhn6\" (UniqueName: \"kubernetes.io/projected/e8cd4885-34c3-4a78-b23d-9e57aa0517ca-kube-api-access-4vhn6\") pod \"ovn-operator-controller-manager-688db7b6c7-ct22p\" (UID: \"e8cd4885-34c3-4a78-b23d-9e57aa0517ca\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.388018 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw99n\" (UniqueName: \"kubernetes.io/projected/bca141be-db7d-4e1e-b95f-12f9b63522b7-kube-api-access-kw99n\") pod \"placement-operator-controller-manager-7d8bb7f44c-fg9pn\" (UID: \"bca141be-db7d-4e1e-b95f-12f9b63522b7\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-fg9pn" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.388801 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snlwk\" (UniqueName: \"kubernetes.io/projected/2784f55b-9b0e-49e0-9a5b-df56008a2be9-kube-api-access-snlwk\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h\" (UID: \"2784f55b-9b0e-49e0-9a5b-df56008a2be9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.388886 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.395298 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-fg9pn" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.406241 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.412227 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2cp7\" (UniqueName: \"kubernetes.io/projected/294a44e7-d4f2-4162-9f34-2cd9c4a9aa49-kube-api-access-k2cp7\") pod \"swift-operator-controller-manager-6859f9b676-kcqct\" (UID: \"294a44e7-d4f2-4162-9f34-2cd9c4a9aa49\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.412395 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llhl2\" (UniqueName: \"kubernetes.io/projected/6492ba28-5e2e-42b3-829e-5b666703bd85-kube-api-access-llhl2\") pod \"telemetry-operator-controller-manager-5db5cf686f-fm85d\" (UID: \"6492ba28-5e2e-42b3-829e-5b666703bd85\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.419184 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.445796 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.447415 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.451328 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-fggkm" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.467934 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.469258 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.476648 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.484317 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-4fv7g" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.515425 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2cp7\" (UniqueName: \"kubernetes.io/projected/294a44e7-d4f2-4162-9f34-2cd9c4a9aa49-kube-api-access-k2cp7\") pod \"swift-operator-controller-manager-6859f9b676-kcqct\" (UID: \"294a44e7-d4f2-4162-9f34-2cd9c4a9aa49\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.515480 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5f5d\" (UniqueName: \"kubernetes.io/projected/07ab2a7c-e330-498d-a1b0-b2155c491839-kube-api-access-s5f5d\") pod \"test-operator-controller-manager-5cd5cb47d7-t48px\" (UID: \"07ab2a7c-e330-498d-a1b0-b2155c491839\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.515515 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llhl2\" (UniqueName: \"kubernetes.io/projected/6492ba28-5e2e-42b3-829e-5b666703bd85-kube-api-access-llhl2\") pod \"telemetry-operator-controller-manager-5db5cf686f-fm85d\" (UID: \"6492ba28-5e2e-42b3-829e-5b666703bd85\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.515581 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtzjr\" (UniqueName: \"kubernetes.io/projected/25a36ab5-71b4-4660-99a6-86c3c6554c86-kube-api-access-mtzjr\") pod \"watcher-operator-controller-manager-fcd7d9895-8wwpz\" (UID: \"25a36ab5-71b4-4660-99a6-86c3c6554c86\") " pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.531500 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.534322 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.560193 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llhl2\" (UniqueName: \"kubernetes.io/projected/6492ba28-5e2e-42b3-829e-5b666703bd85-kube-api-access-llhl2\") pod \"telemetry-operator-controller-manager-5db5cf686f-fm85d\" (UID: \"6492ba28-5e2e-42b3-829e-5b666703bd85\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.567755 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.570371 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.576009 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wwb2v" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.576327 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.578933 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.580946 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2cp7\" (UniqueName: \"kubernetes.io/projected/294a44e7-d4f2-4162-9f34-2cd9c4a9aa49-kube-api-access-k2cp7\") pod \"swift-operator-controller-manager-6859f9b676-kcqct\" (UID: \"294a44e7-d4f2-4162-9f34-2cd9c4a9aa49\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.602556 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.602931 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.605087 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.607487 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-8f87k" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.610443 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp"] Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.622317 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtzjr\" (UniqueName: \"kubernetes.io/projected/25a36ab5-71b4-4660-99a6-86c3c6554c86-kube-api-access-mtzjr\") pod \"watcher-operator-controller-manager-fcd7d9895-8wwpz\" (UID: \"25a36ab5-71b4-4660-99a6-86c3c6554c86\") " pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.622400 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d494f25-e54e-4ddf-a4cc-a632d05db780-cert\") pod \"openstack-operator-controller-manager-6f9d674864-tpv2h\" (UID: \"9d494f25-e54e-4ddf-a4cc-a632d05db780\") " pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.622435 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nqbm\" (UniqueName: \"kubernetes.io/projected/9d494f25-e54e-4ddf-a4cc-a632d05db780-kube-api-access-6nqbm\") pod \"openstack-operator-controller-manager-6f9d674864-tpv2h\" (UID: \"9d494f25-e54e-4ddf-a4cc-a632d05db780\") " 
pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.622480 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqqx4\" (UniqueName: \"kubernetes.io/projected/928529ba-1444-4ccc-9fab-cac6102c3375-kube-api-access-qqqx4\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp\" (UID: \"928529ba-1444-4ccc-9fab-cac6102c3375\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.622549 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5f5d\" (UniqueName: \"kubernetes.io/projected/07ab2a7c-e330-498d-a1b0-b2155c491839-kube-api-access-s5f5d\") pod \"test-operator-controller-manager-5cd5cb47d7-t48px\" (UID: \"07ab2a7c-e330-498d-a1b0-b2155c491839\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.660834 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtzjr\" (UniqueName: \"kubernetes.io/projected/25a36ab5-71b4-4660-99a6-86c3c6554c86-kube-api-access-mtzjr\") pod \"watcher-operator-controller-manager-fcd7d9895-8wwpz\" (UID: \"25a36ab5-71b4-4660-99a6-86c3c6554c86\") " pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.688326 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5f5d\" (UniqueName: \"kubernetes.io/projected/07ab2a7c-e330-498d-a1b0-b2155c491839-kube-api-access-s5f5d\") pod \"test-operator-controller-manager-5cd5cb47d7-t48px\" (UID: \"07ab2a7c-e330-498d-a1b0-b2155c491839\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.724401 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqqx4\" (UniqueName: \"kubernetes.io/projected/928529ba-1444-4ccc-9fab-cac6102c3375-kube-api-access-qqqx4\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp\" (UID: \"928529ba-1444-4ccc-9fab-cac6102c3375\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.724529 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d494f25-e54e-4ddf-a4cc-a632d05db780-cert\") pod \"openstack-operator-controller-manager-6f9d674864-tpv2h\" (UID: \"9d494f25-e54e-4ddf-a4cc-a632d05db780\") " pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.724565 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nqbm\" (UniqueName: \"kubernetes.io/projected/9d494f25-e54e-4ddf-a4cc-a632d05db780-kube-api-access-6nqbm\") pod \"openstack-operator-controller-manager-6f9d674864-tpv2h\" (UID: \"9d494f25-e54e-4ddf-a4cc-a632d05db780\") " pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" Oct 03 08:05:15 crc kubenswrapper[4664]: E1003 08:05:15.725172 4664 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 03 08:05:15 crc kubenswrapper[4664]: E1003 08:05:15.726189 4664 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/9d494f25-e54e-4ddf-a4cc-a632d05db780-cert podName:9d494f25-e54e-4ddf-a4cc-a632d05db780 nodeName:}" failed. No retries permitted until 2025-10-03 08:05:16.226172654 +0000 UTC m=+1017.047363144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d494f25-e54e-4ddf-a4cc-a632d05db780-cert") pod "openstack-operator-controller-manager-6f9d674864-tpv2h" (UID: "9d494f25-e54e-4ddf-a4cc-a632d05db780") : secret "webhook-server-cert" not found Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.761382 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqqx4\" (UniqueName: \"kubernetes.io/projected/928529ba-1444-4ccc-9fab-cac6102c3375-kube-api-access-qqqx4\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp\" (UID: \"928529ba-1444-4ccc-9fab-cac6102c3375\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.782173 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nqbm\" (UniqueName: \"kubernetes.io/projected/9d494f25-e54e-4ddf-a4cc-a632d05db780-kube-api-access-6nqbm\") pod \"openstack-operator-controller-manager-6f9d674864-tpv2h\" (UID: \"9d494f25-e54e-4ddf-a4cc-a632d05db780\") " pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.788717 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.816120 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.820826 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.833296 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2784f55b-9b0e-49e0-9a5b-df56008a2be9-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h\" (UID: \"2784f55b-9b0e-49e0-9a5b-df56008a2be9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.844656 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2784f55b-9b0e-49e0-9a5b-df56008a2be9-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h\" (UID: \"2784f55b-9b0e-49e0-9a5b-df56008a2be9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.888690 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp" Oct 03 08:05:15 crc kubenswrapper[4664]: I1003 08:05:15.973787 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-mhzlb"] Oct 03 08:05:16 crc kubenswrapper[4664]: I1003 08:05:16.050427 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" Oct 03 08:05:16 crc kubenswrapper[4664]: I1003 08:05:16.217106 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-mhzlb" event={"ID":"30114f47-bc69-49b2-b3ab-d201c2c146ee","Type":"ContainerStarted","Data":"daed6bb39380270900707b866df8e3b6a92c1ab4b7e4139b9944fc4112c85550"} Oct 03 08:05:16 crc kubenswrapper[4664]: I1003 08:05:16.246389 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d494f25-e54e-4ddf-a4cc-a632d05db780-cert\") pod \"openstack-operator-controller-manager-6f9d674864-tpv2h\" (UID: \"9d494f25-e54e-4ddf-a4cc-a632d05db780\") " pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" Oct 03 08:05:16 crc kubenswrapper[4664]: E1003 08:05:16.246527 4664 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 03 08:05:16 crc kubenswrapper[4664]: E1003 08:05:16.246578 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d494f25-e54e-4ddf-a4cc-a632d05db780-cert podName:9d494f25-e54e-4ddf-a4cc-a632d05db780 nodeName:}" failed. No retries permitted until 2025-10-03 08:05:17.246561472 +0000 UTC m=+1018.067751962 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d494f25-e54e-4ddf-a4cc-a632d05db780-cert") pod "openstack-operator-controller-manager-6f9d674864-tpv2h" (UID: "9d494f25-e54e-4ddf-a4cc-a632d05db780") : secret "webhook-server-cert" not found Oct 03 08:05:16 crc kubenswrapper[4664]: I1003 08:05:16.729787 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-nw8d7"] Oct 03 08:05:16 crc kubenswrapper[4664]: W1003 08:05:16.746719 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16a5e165_d1b6_4023_b48e_f4b918730203.slice/crio-ac6881f14da961086153deac9c980679289aa13a9d8d374b9502fb146cbe9ae5 WatchSource:0}: Error finding container ac6881f14da961086153deac9c980679289aa13a9d8d374b9502fb146cbe9ae5: Status 404 returned error can't find the container with id ac6881f14da961086153deac9c980679289aa13a9d8d374b9502fb146cbe9ae5 Oct 03 08:05:16 crc kubenswrapper[4664]: I1003 08:05:16.911771 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-g5h6q"] Oct 03 08:05:16 crc kubenswrapper[4664]: I1003 08:05:16.927939 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-bvgv4"] Oct 03 08:05:16 crc kubenswrapper[4664]: W1003 08:05:16.929902 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56f9a3d8_3c1e_4d8d_a005_3c5f42beb67e.slice/crio-ae285fee7f835e1cd5a4e1685f5a68f710644659da7bfd3024f90ca52e1fe266 WatchSource:0}: Error finding container ae285fee7f835e1cd5a4e1685f5a68f710644659da7bfd3024f90ca52e1fe266: Status 404 returned error can't find the container with id ae285fee7f835e1cd5a4e1685f5a68f710644659da7bfd3024f90ca52e1fe266 Oct 03 08:05:16 crc kubenswrapper[4664]: W1003 08:05:16.932473 4664 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1eed736_5b4e_4e82_915d_288d59a82b94.slice/crio-15a2eb0c7d3d524d2b1e925152b9f333c67e89267b11a505a42098665192b3e5 WatchSource:0}: Error finding container 15a2eb0c7d3d524d2b1e925152b9f333c67e89267b11a505a42098665192b3e5: Status 404 returned error can't find the container with id 15a2eb0c7d3d524d2b1e925152b9f333c67e89267b11a505a42098665192b3e5 Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.259544 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-nw8d7" event={"ID":"16a5e165-d1b6-4023-b48e-f4b918730203","Type":"ContainerStarted","Data":"ac6881f14da961086153deac9c980679289aa13a9d8d374b9502fb146cbe9ae5"} Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.269645 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-44trb"] Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.272392 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d494f25-e54e-4ddf-a4cc-a632d05db780-cert\") pod \"openstack-operator-controller-manager-6f9d674864-tpv2h\" (UID: \"9d494f25-e54e-4ddf-a4cc-a632d05db780\") " pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.292849 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq"] Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.303263 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d494f25-e54e-4ddf-a4cc-a632d05db780-cert\") pod \"openstack-operator-controller-manager-6f9d674864-tpv2h\" (UID: \"9d494f25-e54e-4ddf-a4cc-a632d05db780\") " pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.308239 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-qf7qg"] Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.318207 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-g5h6q" event={"ID":"56f9a3d8-3c1e-4d8d-a005-3c5f42beb67e","Type":"ContainerStarted","Data":"ae285fee7f835e1cd5a4e1685f5a68f710644659da7bfd3024f90ca52e1fe266"} Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.343028 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-fhpcz"] Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.343075 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-bvgv4" event={"ID":"f1eed736-5b4e-4e82-915d-288d59a82b94","Type":"ContainerStarted","Data":"15a2eb0c7d3d524d2b1e925152b9f333c67e89267b11a505a42098665192b3e5"} Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.343474 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.387564 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-59qtz"] Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.409410 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl"] Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.434252 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-bf92q"] Oct 03 08:05:17 crc kubenswrapper[4664]: W1003 08:05:17.436535 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5285a08_51d1_4f27_b87e_73fc4c0dc037.slice/crio-9e96f577491d1bf60846282052a513fd61ea5081eb1cce618b4efd0212486820 WatchSource:0}: Error finding container 9e96f577491d1bf60846282052a513fd61ea5081eb1cce618b4efd0212486820: Status 404 returned error can't find the container with id 9e96f577491d1bf60846282052a513fd61ea5081eb1cce618b4efd0212486820 Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.458365 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-fg9pn"] Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.481024 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-tz9q5"] Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.500082 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-qlgfl"] Oct 03 08:05:17 crc kubenswrapper[4664]: W1003 08:05:17.509623 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod928529ba_1444_4ccc_9fab_cac6102c3375.slice/crio-141a58a1b7089ad72d88990e0b76335d3303ddc100ff1629802a10fcefbcee07 WatchSource:0}: Error finding container 141a58a1b7089ad72d88990e0b76335d3303ddc100ff1629802a10fcefbcee07: Status 404 returned error can't find the container with id 141a58a1b7089ad72d88990e0b76335d3303ddc100ff1629802a10fcefbcee07 Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.514047 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp"] Oct 03 08:05:17 crc kubenswrapper[4664]: E1003 08:05:17.518970 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qqqx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp_openstack-operators(928529ba-1444-4ccc-9fab-cac6102c3375): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:05:17 crc kubenswrapper[4664]: E1003 08:05:17.520368 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp" podUID="928529ba-1444-4ccc-9fab-cac6102c3375" Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.526935 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h"] Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.549661 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9"] Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.571721 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p"] Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.585878 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct"] Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.598850 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d"] Oct 03 08:05:17 crc kubenswrapper[4664]: E1003 08:05:17.608953 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fdb7ea8542adb2eca73f11bd78e6aebceed2ba7a1e9fdd149c75e0049d09ce0,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nt47k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-6fd6854b49-8nvt9_openstack-operators(738a79fc-a362-4c7c-a101-f8551019a96b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.612280 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px"] Oct 03 08:05:17 crc kubenswrapper[4664]: E1003 08:05:17.613462 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k2cp7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6859f9b676-kcqct_openstack-operators(294a44e7-d4f2-4162-9f34-2cd9c4a9aa49): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:05:17 crc kubenswrapper[4664]: I1003 08:05:17.668178 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz"] Oct 03 08:05:17 crc kubenswrapper[4664]: E1003 08:05:17.673805 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4vhn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovn-operator-controller-manager-688db7b6c7-ct22p_openstack-operators(e8cd4885-34c3-4a78-b23d-9e57aa0517ca): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:05:17 crc kubenswrapper[4664]: E1003 08:05:17.674024 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s5f5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd5cb47d7-t48px_openstack-operators(07ab2a7c-e330-498d-a1b0-b2155c491839): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:05:17 crc kubenswrapper[4664]: W1003 08:05:17.677325 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6492ba28_5e2e_42b3_829e_5b666703bd85.slice/crio-2666b828a48bb6f4e3986aa0642f55caa0e6cd96332d933ed3172e5e26e4d37c WatchSource:0}: Error finding container 2666b828a48bb6f4e3986aa0642f55caa0e6cd96332d933ed3172e5e26e4d37c: Status 404 returned error can't find the container with id 2666b828a48bb6f4e3986aa0642f55caa0e6cd96332d933ed3172e5e26e4d37c Oct 03 08:05:17 crc kubenswrapper[4664]: E1003 08:05:17.720059 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-llhl2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5db5cf686f-fm85d_openstack-operators(6492ba28-5e2e-42b3-829e-5b666703bd85): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:05:17 crc kubenswrapper[4664]: E1003 08:05:17.726376 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mtzjr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-fcd7d9895-8wwpz_openstack-operators(25a36ab5-71b4-4660-99a6-86c3c6554c86): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 03 08:05:17 crc kubenswrapper[4664]: E1003 08:05:17.863424 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct" podUID="294a44e7-d4f2-4162-9f34-2cd9c4a9aa49"
Oct 03 08:05:17 crc kubenswrapper[4664]: E1003 08:05:17.877713 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9" podUID="738a79fc-a362-4c7c-a101-f8551019a96b"
Oct 03 08:05:17 crc kubenswrapper[4664]: E1003 08:05:17.936078 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px" podUID="07ab2a7c-e330-498d-a1b0-b2155c491839"
Oct 03 08:05:17 crc kubenswrapper[4664]: E1003 08:05:17.960528 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d" podUID="6492ba28-5e2e-42b3-829e-5b666703bd85"
Oct 03 08:05:17 crc kubenswrapper[4664]: E1003 08:05:17.963054 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p" podUID="e8cd4885-34c3-4a78-b23d-9e57aa0517ca"
Oct 03 08:05:17 crc kubenswrapper[4664]: E1003 08:05:17.966595 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz" podUID="25a36ab5-71b4-4660-99a6-86c3c6554c86"
Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.068569 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h"]
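
The burst of ErrImagePull: "pull QPS exceeded" failures above is the kubelet's own client-side rate limit, not a registry-side error: image pulls pass through a token-bucket limiter governed by the KubeletConfiguration fields registryPullQPS (default 5 pulls/s) and registryBurst (default 10). With roughly twenty operator images requested in the same second, the bucket drains and the overflow pulls fail immediately and are left to the retry/back-off machinery. Below is a minimal sketch of how such a token bucket behaves; it is an illustrative model with hypothetical names, not the kubelet's actual implementation.

```python
import time

class TokenBucket:
    """Illustrative token-bucket limiter modeled on registryPullQPS/registryBurst."""
    def __init__(self, qps=5.0, burst=10):
        self.qps, self.burst = qps, burst
        self.tokens = float(burst)        # bucket starts full
        self.last = time.monotonic()

    def try_accept(self):
        now = time.monotonic()
        # refill proportionally to elapsed time, capped at the burst size
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.qps)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False  # the kubelet surfaces this as ErrImagePull: "pull QPS exceeded"

bucket = TokenBucket()
results = [bucket.try_accept() for _ in range(20)]  # 20 near-simultaneous pull requests
print(results.count(True), "admitted,", results.count(False), "rejected")  # ~10 admitted, ~10 rejected
```
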
Oct 03 08:05:18 crc kubenswrapper[4664]: W1003 08:05:18.081934 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d494f25_e54e_4ddf_a4cc_a632d05db780.slice/crio-68358059c67e3823fe3116d8e39c2954f0d3908ccae83322695d7cb99864d951 WatchSource:0}: Error finding container 68358059c67e3823fe3116d8e39c2954f0d3908ccae83322695d7cb99864d951: Status 404 returned error can't find the container with id 68358059c67e3823fe3116d8e39c2954f0d3908ccae83322695d7cb99864d951
Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.361916 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-44trb" event={"ID":"d80c9513-6585-415d-a747-2ff4bcca33d7","Type":"ContainerStarted","Data":"dbe07495c7c169bb0dcc8a99afdb1fb01e1a0fa3611b87e2b7d5b03158857029"}
Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.368662 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct" event={"ID":"294a44e7-d4f2-4162-9f34-2cd9c4a9aa49","Type":"ContainerStarted","Data":"edc98d4a33e6f7f4826543a74162627c9e2fc526a1d8f4fda52708a319baec9f"}
Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.368778 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct" event={"ID":"294a44e7-d4f2-4162-9f34-2cd9c4a9aa49","Type":"ContainerStarted","Data":"aee3a56dd11a9717097d6eec5dd7b28510761886cd0aa90e41692fd40cdce4fe"}
Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.370718 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" event={"ID":"9d494f25-e54e-4ddf-a4cc-a632d05db780","Type":"ContainerStarted","Data":"68358059c67e3823fe3116d8e39c2954f0d3908ccae83322695d7cb99864d951"}
Oct 03 08:05:18 crc kubenswrapper[4664]: E1003 08:05:18.371192 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct" podUID="294a44e7-d4f2-4162-9f34-2cd9c4a9aa49"
Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.374122 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-qlgfl" event={"ID":"28f90c47-2add-426b-953e-be4842b71cfc","Type":"ContainerStarted","Data":"819f50b8e9816272b152581abf713b1b7aa08c1701b04aa84df6b8c4561d4337"}
Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.382169 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-tz9q5" event={"ID":"d5285a08-51d1-4f27-b87e-73fc4c0dc037","Type":"ContainerStarted","Data":"9e96f577491d1bf60846282052a513fd61ea5081eb1cce618b4efd0212486820"}
Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.387995 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9" event={"ID":"738a79fc-a362-4c7c-a101-f8551019a96b","Type":"ContainerStarted","Data":"e2ed004a9a38fd8dd067c7c572932d16d2e5f83d50e910504b065f1990c74872"}
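
The W1003 manager.go:1169 "Failed to process watch event ... Status 404" warnings come from the kubelet's embedded cAdvisor and are a startup race: the cgroup watch fires before the new crio-<id> container is registered with cAdvisor (or after it is already gone), so the lookup 404s. In this log they are benign; container 9e96f577... flagged at 08:05:17.436 is reported ContainerStarted for the mariadb-operator pod at 08:05:18.382, and 68358059... flagged at 08:05:18.081 backs the openstack-operator pod started at 08:05:18.370. A quick way to cross-check every flagged ID against later PLEG ContainerStarted events, as a sketch over a saved journal dump (the kubelet.log path is hypothetical):

```python
import re

# e.g. journalctl -u kubelet > kubelet.log
log = open("kubelet.log", encoding="utf-8").read()

flagged = set(re.findall(r"Error finding container ([0-9a-f]{64})", log))
started = set(re.findall(r'"ContainerStarted","Data":"([0-9a-f]{64})"', log))

for cid in sorted(flagged):
    verdict = "started later (benign race)" if cid in started else "never started -- investigate"
    print(f"{cid[:12]}  {verdict}")
```
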
pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9" event={"ID":"738a79fc-a362-4c7c-a101-f8551019a96b","Type":"ContainerStarted","Data":"1d360104c9423252e4550331616085c3f97eb7c3202307b143f36500f98b8736"} Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.391386 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" event={"ID":"2784f55b-9b0e-49e0-9a5b-df56008a2be9","Type":"ContainerStarted","Data":"e74e5850f71445c3f4957199ded2bca60b8efa72a73b72ee2371520c5221580d"} Oct 03 08:05:18 crc kubenswrapper[4664]: E1003 08:05:18.393581 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fdb7ea8542adb2eca73f11bd78e6aebceed2ba7a1e9fdd149c75e0049d09ce0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9" podUID="738a79fc-a362-4c7c-a101-f8551019a96b" Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.402872 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp" event={"ID":"928529ba-1444-4ccc-9fab-cac6102c3375","Type":"ContainerStarted","Data":"141a58a1b7089ad72d88990e0b76335d3303ddc100ff1629802a10fcefbcee07"} Oct 03 08:05:18 crc kubenswrapper[4664]: E1003 08:05:18.406388 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp" podUID="928529ba-1444-4ccc-9fab-cac6102c3375" Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.413726 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d" event={"ID":"6492ba28-5e2e-42b3-829e-5b666703bd85","Type":"ContainerStarted","Data":"558e9fad5a13136048cd7fbf21c9bfdf8582741cb78fad38da470c74877099cb"} Oct 03 08:05:18 crc kubenswrapper[4664]: E1003 08:05:18.417084 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d" podUID="6492ba28-5e2e-42b3-829e-5b666703bd85" Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.434455 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d" event={"ID":"6492ba28-5e2e-42b3-829e-5b666703bd85","Type":"ContainerStarted","Data":"2666b828a48bb6f4e3986aa0642f55caa0e6cd96332d933ed3172e5e26e4d37c"} Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.449338 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px" event={"ID":"07ab2a7c-e330-498d-a1b0-b2155c491839","Type":"ContainerStarted","Data":"f1c772790a2cf926bd120a427dc8e094cd84698241fc4404dd63b57bd652f9ff"} Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.449392 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px" event={"ID":"07ab2a7c-e330-498d-a1b0-b2155c491839","Type":"ContainerStarted","Data":"e6a03c7013cb8305cfdfd44da4ee3fb3150e1be458dda77e55403428b4a53f1e"} Oct 03 08:05:18 crc kubenswrapper[4664]: E1003 08:05:18.459620 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px" podUID="07ab2a7c-e330-498d-a1b0-b2155c491839" Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.468721 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz" event={"ID":"25a36ab5-71b4-4660-99a6-86c3c6554c86","Type":"ContainerStarted","Data":"70c89f2ff1559ae37d1c114648a09676f2e652aebcc9d42ed59d7f219c003db9"} Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.468773 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz" event={"ID":"25a36ab5-71b4-4660-99a6-86c3c6554c86","Type":"ContainerStarted","Data":"68a82d0efab37b7fb8005c91dbd18477fbaeacf119f72e41f94d566e7db2ec2b"} Oct 03 08:05:18 crc kubenswrapper[4664]: E1003 08:05:18.476810 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz" podUID="25a36ab5-71b4-4660-99a6-86c3c6554c86" Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.480286 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl" event={"ID":"47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc","Type":"ContainerStarted","Data":"11c864bf44e147cf77fcfa791b6470276c18f0c436f473c9c38c1ef1a10c41a7"} Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.489436 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-qf7qg" event={"ID":"2ed5f442-2c30-4c29-a65b-4b3d9262cbce","Type":"ContainerStarted","Data":"ffa3cbd1d82dc004518f5d74af72e5ea0524b133ee695312071859808b3b72a3"} Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.496224 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p" event={"ID":"e8cd4885-34c3-4a78-b23d-9e57aa0517ca","Type":"ContainerStarted","Data":"a4ea3bddbd262e86a7b04751e3d82ebce9eae67319c26d36d2d696545103c020"} Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.496264 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p" event={"ID":"e8cd4885-34c3-4a78-b23d-9e57aa0517ca","Type":"ContainerStarted","Data":"be1d107b5e8a9f51afb1d78b738d9ecfcdbc30985dbd13eac242c8f9de73f04c"} Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.499856 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-fg9pn" 
event={"ID":"bca141be-db7d-4e1e-b95f-12f9b63522b7","Type":"ContainerStarted","Data":"61908ec71b26a05f25c167a0da8feb8367c5b2633c2b0e0edafa5e6be951e940"} Oct 03 08:05:18 crc kubenswrapper[4664]: E1003 08:05:18.503631 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p" podUID="e8cd4885-34c3-4a78-b23d-9e57aa0517ca" Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.506659 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-59qtz" event={"ID":"5bedd109-c928-4b87-8593-21f72b0a6165","Type":"ContainerStarted","Data":"97f56398a720028b64c1c9979d58fc66d1c9072b3a9ac6458a108e31e0b38b97"} Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.511981 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-bf92q" event={"ID":"7d9f54b0-ed53-4042-98d1-eb5ceb6f629b","Type":"ContainerStarted","Data":"47c815246da3a3a99336006bef186aa15af18a4caa568c7bcf75d0a913adf3e4"} Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.526892 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-fhpcz" event={"ID":"45229d75-9e65-47bb-a54a-b27742fe2717","Type":"ContainerStarted","Data":"7551925891ac0bbddd775881134e6315406a913cf4297aebbf57608f82691a3b"} Oct 03 08:05:18 crc kubenswrapper[4664]: I1003 08:05:18.536065 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq" event={"ID":"a931be65-b5d0-4685-82ea-42102c8235c8","Type":"ContainerStarted","Data":"8779aef530d0c49b65f98ccdb4bee5510d695583430d907ed7f4234e73a5fad2"} Oct 03 08:05:19 crc kubenswrapper[4664]: I1003 08:05:19.562800 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" event={"ID":"9d494f25-e54e-4ddf-a4cc-a632d05db780","Type":"ContainerStarted","Data":"92be7c64e32288cd141ecda8344571b9a13e9fdf4f21edb90e77833de72ab41b"} Oct 03 08:05:19 crc kubenswrapper[4664]: I1003 08:05:19.563104 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" event={"ID":"9d494f25-e54e-4ddf-a4cc-a632d05db780","Type":"ContainerStarted","Data":"e2550e9bda7c87745e1c20fc9e4949a1c71475b1168d125189502f4b41f082ea"} Oct 03 08:05:19 crc kubenswrapper[4664]: I1003 08:05:19.563122 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" Oct 03 08:05:19 crc kubenswrapper[4664]: E1003 08:05:19.566516 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p" podUID="e8cd4885-34c3-4a78-b23d-9e57aa0517ca" Oct 03 08:05:19 crc kubenswrapper[4664]: E1003 08:05:19.567241 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz" podUID="25a36ab5-71b4-4660-99a6-86c3c6554c86" Oct 03 08:05:19 crc kubenswrapper[4664]: E1003 08:05:19.567225 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px" podUID="07ab2a7c-e330-498d-a1b0-b2155c491839" Oct 03 08:05:19 crc kubenswrapper[4664]: E1003 08:05:19.568913 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp" podUID="928529ba-1444-4ccc-9fab-cac6102c3375" Oct 03 08:05:19 crc kubenswrapper[4664]: E1003 08:05:19.568944 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d" podUID="6492ba28-5e2e-42b3-829e-5b666703bd85" Oct 03 08:05:19 crc kubenswrapper[4664]: E1003 08:05:19.569535 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct" podUID="294a44e7-d4f2-4162-9f34-2cd9c4a9aa49" Oct 03 08:05:19 crc kubenswrapper[4664]: E1003 08:05:19.569578 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fdb7ea8542adb2eca73f11bd78e6aebceed2ba7a1e9fdd149c75e0049d09ce0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9" podUID="738a79fc-a362-4c7c-a101-f8551019a96b" Oct 03 08:05:19 crc kubenswrapper[4664]: I1003 08:05:19.657002 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" podStartSLOduration=4.656979965 podStartE2EDuration="4.656979965s" podCreationTimestamp="2025-10-03 08:05:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:05:19.647929631 +0000 UTC m=+1020.469120131" watchObservedRunningTime="2025-10-03 08:05:19.656979965 +0000 UTC m=+1020.478170455" Oct 03 08:05:20 crc kubenswrapper[4664]: E1003 08:05:20.571778 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d" podUID="6492ba28-5e2e-42b3-829e-5b666703bd85" Oct 03 08:05:27 crc kubenswrapper[4664]: I1003 08:05:27.351258 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6f9d674864-tpv2h" Oct 03 08:05:30 crc kubenswrapper[4664]: E1003 08:05:30.218946 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:b7409dcf05c85eab205904d29d4276f8e927c772eba6363ecfa21ab10c4aaa01" Oct 03 08:05:30 crc kubenswrapper[4664]: E1003 08:05:30.219441 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:b7409dcf05c85eab205904d29d4276f8e927c772eba6363ecfa21ab10c4aaa01,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-52mc4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-59d6cfdf45-8jdvq_openstack-operators(a931be65-b5d0-4685-82ea-42102c8235c8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:05:30 crc kubenswrapper[4664]: E1003 08:05:30.684468 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = 
Oct 03 08:05:30 crc kubenswrapper[4664]: E1003 08:05:30.684468 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq" podUID="a931be65-b5d0-4685-82ea-42102c8235c8"
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.692510 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl" event={"ID":"47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc","Type":"ContainerStarted","Data":"81f76c5ffcae580b68db3626075ba1cef0f9255dc1bbd63304518aca364e18f4"}
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.699799 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-mhzlb" event={"ID":"30114f47-bc69-49b2-b3ab-d201c2c146ee","Type":"ContainerStarted","Data":"c2c839786423ccf19a97d175e7cb568974f5045c8d611802fc77280e245f81cb"}
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.699829 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-mhzlb" event={"ID":"30114f47-bc69-49b2-b3ab-d201c2c146ee","Type":"ContainerStarted","Data":"34e50249858e1d70c55f53c9a9153f9f8ae9ebdd996a1aa7c30dfa672ca5ed37"}
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.699873 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-mhzlb"
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.711875 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-nw8d7" event={"ID":"16a5e165-d1b6-4023-b48e-f4b918730203","Type":"ContainerStarted","Data":"ae3bc6c7e091d469ba74ba0fcb45ce80035ff227c59a118e54ba56c2d2233537"}
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.718694 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" event={"ID":"2784f55b-9b0e-49e0-9a5b-df56008a2be9","Type":"ContainerStarted","Data":"cfed7658a8f0402c81de734603f6d74ea81175cf30cab9d49ca6117db03482fb"}
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.736699 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-mhzlb" podStartSLOduration=3.551240641 podStartE2EDuration="17.736677328s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:16.114721308 +0000 UTC m=+1016.935911798" lastFinishedPulling="2025-10-03 08:05:30.300157995 +0000 UTC m=+1031.121348485" observedRunningTime="2025-10-03 08:05:31.72680371 +0000 UTC m=+1032.547994210" watchObservedRunningTime="2025-10-03 08:05:31.736677328 +0000 UTC m=+1032.557867818"
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.738740 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-bvgv4" event={"ID":"f1eed736-5b4e-4e82-915d-288d59a82b94","Type":"ContainerStarted","Data":"b656ee0a64b712296551b27eb17003722f4987f55590a6a2ad74f9d3e8356017"}
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.738805 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-bvgv4" event={"ID":"f1eed736-5b4e-4e82-915d-288d59a82b94","Type":"ContainerStarted","Data":"dcdd0b07b88b11c10e99803570a8ded56073c2a52819a303ea7aa69ac8baba4d"}
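
The pod_startup_latency_tracker entries make the pull cost visible: podStartE2EDuration is the span from podCreationTimestamp to observedRunningTime, and podStartSLOduration is that same span with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted, since the startup SLO excludes pull time. The cinder-operator entry above is internally consistent with that relationship, which a few lines of stdlib Python can verify (podCreationTimestamp is printed at second precision, so the check is done on the reported durations):

```python
from datetime import datetime

fmt = "%Y-%m-%d %H:%M:%S.%f"
# values from the cinder-operator tracker entry, truncated to microseconds for %f
first = datetime.strptime("2025-10-03 08:05:16.114721308"[:26], fmt)  # firstStartedPulling
last  = datetime.strptime("2025-10-03 08:05:30.300157995"[:26], fmt)  # lastFinishedPulling

e2e  = 17.736677328                    # podStartE2EDuration from the entry
pull = (last - first).total_seconds()  # image-pull window
print(f"{e2e} - {pull:.6f} = {e2e - pull:.6f}")  # ~3.551241 == podStartSLOduration
```
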
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.739721 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-bvgv4"
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.757846 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq" event={"ID":"a931be65-b5d0-4685-82ea-42102c8235c8","Type":"ContainerStarted","Data":"e3bce5a054b846d46379fe96270ca8dde47636562da91c62934d6f9bda1fdbaf"}
Oct 03 08:05:31 crc kubenswrapper[4664]: E1003 08:05:31.767028 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:b7409dcf05c85eab205904d29d4276f8e927c772eba6363ecfa21ab10c4aaa01\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq" podUID="a931be65-b5d0-4685-82ea-42102c8235c8"
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.769497 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-bvgv4" podStartSLOduration=4.36241238 podStartE2EDuration="17.769479344s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:16.939818704 +0000 UTC m=+1017.761009194" lastFinishedPulling="2025-10-03 08:05:30.346885668 +0000 UTC m=+1031.168076158" observedRunningTime="2025-10-03 08:05:31.767525468 +0000 UTC m=+1032.588715968" watchObservedRunningTime="2025-10-03 08:05:31.769479344 +0000 UTC m=+1032.590669834"
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.778985 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-59qtz" event={"ID":"5bedd109-c928-4b87-8593-21f72b0a6165","Type":"ContainerStarted","Data":"bf74344ec77ad53c1b783587ea9746d78b2bee965483e79db7ed87b8b74fda97"}
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.839889 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-bf92q" event={"ID":"7d9f54b0-ed53-4042-98d1-eb5ceb6f629b","Type":"ContainerStarted","Data":"c9bca277965ea0c4d1bbe7e00a7119745e03319f5b61cacbb39912b4cc8570f3"}
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.894997 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-fhpcz" event={"ID":"45229d75-9e65-47bb-a54a-b27742fe2717","Type":"ContainerStarted","Data":"9b821f445de1109b5246f4b7b3ad6b1641c08d92778420ffad402b8a366cab77"}
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.896856 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-qlgfl" event={"ID":"28f90c47-2add-426b-953e-be4842b71cfc","Type":"ContainerStarted","Data":"d0f7928a44b8abdd01a5d4f27d027802c38b6ca5a121507e3b8cac636740a3cd"}
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.896901 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-qlgfl" event={"ID":"28f90c47-2add-426b-953e-be4842b71cfc","Type":"ContainerStarted","Data":"838c1cde646771c6b260918c4cd6c1d7d20977fc3064014656d89093df83a94e"}
Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.897276 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-qlgfl" Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.910245 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-44trb" event={"ID":"d80c9513-6585-415d-a747-2ff4bcca33d7","Type":"ContainerStarted","Data":"8985ce81ff3e1009e0b6c2639f5636b65f3f852135f0876603cabd0203776901"} Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.910296 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-44trb" event={"ID":"d80c9513-6585-415d-a747-2ff4bcca33d7","Type":"ContainerStarted","Data":"7b2c06aa290f32d374670db3587230e711e8f33afb86feb944fbbbe96c7d4a26"} Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.910367 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-44trb" Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.916693 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-qlgfl" podStartSLOduration=5.002926395 podStartE2EDuration="17.916679638s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.439211232 +0000 UTC m=+1018.260401722" lastFinishedPulling="2025-10-03 08:05:30.352964475 +0000 UTC m=+1031.174154965" observedRunningTime="2025-10-03 08:05:31.916099721 +0000 UTC m=+1032.737290241" watchObservedRunningTime="2025-10-03 08:05:31.916679638 +0000 UTC m=+1032.737870128" Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.924241 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-qf7qg" event={"ID":"2ed5f442-2c30-4c29-a65b-4b3d9262cbce","Type":"ContainerStarted","Data":"d73b34e92cd1675961686aa3cb6e39c4ba8080226e73e11c79723f2ed8375b34"} Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.951031 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-44trb" podStartSLOduration=4.99281622 podStartE2EDuration="17.95101338s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.350850344 +0000 UTC m=+1018.172040834" lastFinishedPulling="2025-10-03 08:05:30.309047504 +0000 UTC m=+1031.130237994" observedRunningTime="2025-10-03 08:05:31.94997543 +0000 UTC m=+1032.771165930" watchObservedRunningTime="2025-10-03 08:05:31.95101338 +0000 UTC m=+1032.772203870" Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.958927 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-g5h6q" event={"ID":"56f9a3d8-3c1e-4d8d-a005-3c5f42beb67e","Type":"ContainerStarted","Data":"b568349e42dfb6d062c075173afb4e002a6f10ebfdce6285cec7d741f888baab"} Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.985572 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-fg9pn" event={"ID":"bca141be-db7d-4e1e-b95f-12f9b63522b7","Type":"ContainerStarted","Data":"2f191f592a29148a95abda212e92d16d6514c74ce0da0cd5fe3f2759214c9a9e"} Oct 03 08:05:31 crc kubenswrapper[4664]: I1003 08:05:31.991541 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-tz9q5" 
event={"ID":"d5285a08-51d1-4f27-b87e-73fc4c0dc037","Type":"ContainerStarted","Data":"6557a42ca1df5c58751ee9336b85cd08e3487d5174cf5d3bd352677a7f86a778"} Oct 03 08:05:32 crc kubenswrapper[4664]: I1003 08:05:32.998583 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-fhpcz" event={"ID":"45229d75-9e65-47bb-a54a-b27742fe2717","Type":"ContainerStarted","Data":"f5b7291df09981b568a9f4db25607fd1f082ec005d50f1a301a70abb161ee357"} Oct 03 08:05:32 crc kubenswrapper[4664]: I1003 08:05:32.998750 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-fhpcz" Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.001581 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-fg9pn" event={"ID":"bca141be-db7d-4e1e-b95f-12f9b63522b7","Type":"ContainerStarted","Data":"6d7207461f3440ddd67ccc275c47cf4726a86d708f6e0866f5aa1c5a81f4d583"} Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.001842 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-fg9pn" Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.004332 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-tz9q5" event={"ID":"d5285a08-51d1-4f27-b87e-73fc4c0dc037","Type":"ContainerStarted","Data":"3ae02145077696a76a0f33171897ec9287a0b2b927a6154d46ce36d25d9d0f3b"} Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.004803 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-tz9q5" Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.006831 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-nw8d7" event={"ID":"16a5e165-d1b6-4023-b48e-f4b918730203","Type":"ContainerStarted","Data":"54d592a18263a6baa7d005f65a6fd4d69644e41df32b72541b219ce779a6cb8a"} Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.006977 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-nw8d7" Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.008158 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl" event={"ID":"47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc","Type":"ContainerStarted","Data":"151249c9249c6d30bdf9412deaf9fac18468277929495b53a66264a4617a2d4f"} Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.008587 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl" Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.011042 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-bf92q" event={"ID":"7d9f54b0-ed53-4042-98d1-eb5ceb6f629b","Type":"ContainerStarted","Data":"f2e0f3df84d3596243ed40bd42c55d987cebaddbb2615af86669a8d1e423567c"} Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.034580 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-bf92q" Oct 03 08:05:33 crc kubenswrapper[4664]: 
I1003 08:05:33.047706 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-59qtz" event={"ID":"5bedd109-c928-4b87-8593-21f72b0a6165","Type":"ContainerStarted","Data":"b9d1a417d182d57a23c1ecaa9e754ba0f328854af479e96f1d26aa2fb75745d5"} Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.048160 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-59qtz" Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.056936 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" event={"ID":"2784f55b-9b0e-49e0-9a5b-df56008a2be9","Type":"ContainerStarted","Data":"c6eb67f4c2d621574bb5374aea3281478e3c21128d009841247e21f6aca4c975"} Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.058468 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.061205 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl" podStartSLOduration=6.126802998 podStartE2EDuration="19.061186034s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.417939221 +0000 UTC m=+1018.239129711" lastFinishedPulling="2025-10-03 08:05:30.352322257 +0000 UTC m=+1031.173512747" observedRunningTime="2025-10-03 08:05:33.058749443 +0000 UTC m=+1033.879939943" watchObservedRunningTime="2025-10-03 08:05:33.061186034 +0000 UTC m=+1033.882376534" Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.079386 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-fhpcz" podStartSLOduration=6.144898986 podStartE2EDuration="19.079366054s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.417851509 +0000 UTC m=+1018.239041999" lastFinishedPulling="2025-10-03 08:05:30.352318577 +0000 UTC m=+1031.173509067" observedRunningTime="2025-10-03 08:05:33.032869468 +0000 UTC m=+1033.854059988" watchObservedRunningTime="2025-10-03 08:05:33.079366054 +0000 UTC m=+1033.900556544" Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.079629 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-qf7qg" event={"ID":"2ed5f442-2c30-4c29-a65b-4b3d9262cbce","Type":"ContainerStarted","Data":"20393d08226c205e10db1b7f652fc8d93e8a0c9bfcbcefb04a9ff6f06e1d876c"} Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.080215 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-qf7qg" Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.083307 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-g5h6q" event={"ID":"56f9a3d8-3c1e-4d8d-a005-3c5f42beb67e","Type":"ContainerStarted","Data":"91ecf1bb5d1adfd17af7125f73cb1429871cb3d850a1f6e916725973f4c3efd7"} Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.083812 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-599898f689-g5h6q" Oct 03 08:05:33 crc 
kubenswrapper[4664]: E1003 08:05:33.084486 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:b7409dcf05c85eab205904d29d4276f8e927c772eba6363ecfa21ab10c4aaa01\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq" podUID="a931be65-b5d0-4685-82ea-42102c8235c8"
Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.091944 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-bf92q" podStartSLOduration=6.174265064 podStartE2EDuration="19.091931891s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.43159707 +0000 UTC m=+1018.252787560" lastFinishedPulling="2025-10-03 08:05:30.349263897 +0000 UTC m=+1031.170454387" observedRunningTime="2025-10-03 08:05:33.090959302 +0000 UTC m=+1033.912149802" watchObservedRunningTime="2025-10-03 08:05:33.091931891 +0000 UTC m=+1033.913122381"
Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.113964 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-tz9q5" podStartSLOduration=6.237713444 podStartE2EDuration="19.113947273s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.476682835 +0000 UTC m=+1018.297873325" lastFinishedPulling="2025-10-03 08:05:30.352916664 +0000 UTC m=+1031.174107154" observedRunningTime="2025-10-03 08:05:33.110026489 +0000 UTC m=+1033.931216989" watchObservedRunningTime="2025-10-03 08:05:33.113947273 +0000 UTC m=+1033.935137753"
Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.135625 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-fg9pn" podStartSLOduration=6.2110322159999996 podStartE2EDuration="19.135584094s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.425729319 +0000 UTC m=+1018.246919799" lastFinishedPulling="2025-10-03 08:05:30.350281187 +0000 UTC m=+1031.171471677" observedRunningTime="2025-10-03 08:05:33.13166522 +0000 UTC m=+1033.952855720" watchObservedRunningTime="2025-10-03 08:05:33.135584094 +0000 UTC m=+1033.956774594"
Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.151175 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-nw8d7" podStartSLOduration=5.548152119 podStartE2EDuration="19.151160328s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:16.750145022 +0000 UTC m=+1017.571335512" lastFinishedPulling="2025-10-03 08:05:30.353153221 +0000 UTC m=+1031.174343721" observedRunningTime="2025-10-03 08:05:33.148077959 +0000 UTC m=+1033.969268469" watchObservedRunningTime="2025-10-03 08:05:33.151160328 +0000 UTC m=+1033.972350818"
Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.174350 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-qf7qg" podStartSLOduration=6.164817758 podStartE2EDuration="19.174305764s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.343008346 +0000 UTC m=+1018.164198836" lastFinishedPulling="2025-10-03 08:05:30.352496352 +0000 UTC m=+1031.173686842" observedRunningTime="2025-10-03 08:05:33.170803371 +0000 UTC m=+1033.991993861" watchObservedRunningTime="2025-10-03 08:05:33.174305764 +0000 UTC m=+1033.995496274"
Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.190470 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-59qtz" podStartSLOduration=6.255948996 podStartE2EDuration="19.190453225s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.41719821 +0000 UTC m=+1018.238388700" lastFinishedPulling="2025-10-03 08:05:30.351702439 +0000 UTC m=+1031.172892929" observedRunningTime="2025-10-03 08:05:33.185875871 +0000 UTC m=+1034.007066371" watchObservedRunningTime="2025-10-03 08:05:33.190453225 +0000 UTC m=+1034.011643725"
Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.225179 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" podStartSLOduration=6.478944481 podStartE2EDuration="19.225162087s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.605438521 +0000 UTC m=+1018.426629011" lastFinishedPulling="2025-10-03 08:05:30.351656127 +0000 UTC m=+1031.172846617" observedRunningTime="2025-10-03 08:05:33.221894692 +0000 UTC m=+1034.043085182" watchObservedRunningTime="2025-10-03 08:05:33.225162087 +0000 UTC m=+1034.046352577"
Oct 03 08:05:33 crc kubenswrapper[4664]: I1003 08:05:33.248625 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-599898f689-g5h6q" podStartSLOduration=5.829003972 podStartE2EDuration="19.248592241s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:16.932490641 +0000 UTC m=+1017.753681131" lastFinishedPulling="2025-10-03 08:05:30.35207891 +0000 UTC m=+1031.173269400" observedRunningTime="2025-10-03 08:05:33.243497142 +0000 UTC m=+1034.064687632" watchObservedRunningTime="2025-10-03 08:05:33.248592241 +0000 UTC m=+1034.069782731"
Oct 03 08:05:34 crc kubenswrapper[4664]: I1003 08:05:34.089242 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct" event={"ID":"294a44e7-d4f2-4162-9f34-2cd9c4a9aa49","Type":"ContainerStarted","Data":"54cbcef24037797f832a06688e37d94eb1ab9e7bef528c8dbe012e601054eae7"}
Oct 03 08:05:34 crc kubenswrapper[4664]: I1003 08:05:34.106172 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct" podStartSLOduration=4.076756068 podStartE2EDuration="20.106153805s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.61329468 +0000 UTC m=+1018.434485170" lastFinishedPulling="2025-10-03 08:05:33.642692417 +0000 UTC m=+1034.463882907" observedRunningTime="2025-10-03 08:05:34.104389493 +0000 UTC m=+1034.925580003" watchObservedRunningTime="2025-10-03 08:05:34.106153805 +0000 UTC m=+1034.927344295"
Oct 03 08:05:35 crc kubenswrapper[4664]: I1003 08:05:35.100188 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-bf92q"
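The pod_startup_latency_tracker.go:104 entries above are internally consistent: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration equals that end-to-end figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A minimal Go sketch, using the neutron-operator values copied from the entry above (illustrative arithmetic, not kubelet source):

package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	// Layout matches the "2025-10-03 08:05:17.43159707 +0000 UTC" form above.
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-10-03 08:05:14 +0000 UTC")
	firstPull := mustParse("2025-10-03 08:05:17.43159707 +0000 UTC")
	lastPull := mustParse("2025-10-03 08:05:30.349263897 +0000 UTC")
	running := mustParse("2025-10-03 08:05:33.091931891 +0000 UTC")

	e2e := running.Sub(created)     // 19.091931891s = podStartE2EDuration
	pull := lastPull.Sub(firstPull) // 12.917666827s spent pulling the image
	slo := e2e - pull               // 6.174265064s  = podStartSLOduration
	fmt.Println(e2e, pull, slo)
}

The large pull share explains why these operators report SLO durations of only a few seconds against end-to-end durations in the 19-33 second range.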
status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-tz9q5" Oct 03 08:05:35 crc kubenswrapper[4664]: I1003 08:05:35.102370 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n8zxl" Oct 03 08:05:35 crc kubenswrapper[4664]: I1003 08:05:35.399679 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-fg9pn" Oct 03 08:05:35 crc kubenswrapper[4664]: I1003 08:05:35.816847 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct" Oct 03 08:05:36 crc kubenswrapper[4664]: I1003 08:05:36.056756 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h" Oct 03 08:05:39 crc kubenswrapper[4664]: I1003 08:05:39.128755 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp" event={"ID":"928529ba-1444-4ccc-9fab-cac6102c3375","Type":"ContainerStarted","Data":"0a96308b43e174930eddcdcdb8cbaba6c13bad3eae1e7b78913da114d8daf457"} Oct 03 08:05:39 crc kubenswrapper[4664]: I1003 08:05:39.132039 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px" event={"ID":"07ab2a7c-e330-498d-a1b0-b2155c491839","Type":"ContainerStarted","Data":"8ec852ec0ff29d25232890a5f7021da8f8fc1159545b3c441caf2a9ca54742ac"} Oct 03 08:05:39 crc kubenswrapper[4664]: I1003 08:05:39.132559 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px" Oct 03 08:05:39 crc kubenswrapper[4664]: I1003 08:05:39.136640 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p" event={"ID":"e8cd4885-34c3-4a78-b23d-9e57aa0517ca","Type":"ContainerStarted","Data":"6f5f1c404729b55811d51db0b4f351936b18918b8bfb2e1a99ce3f9e92dbae72"} Oct 03 08:05:39 crc kubenswrapper[4664]: I1003 08:05:39.136832 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p" Oct 03 08:05:39 crc kubenswrapper[4664]: I1003 08:05:39.150428 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz" event={"ID":"25a36ab5-71b4-4660-99a6-86c3c6554c86","Type":"ContainerStarted","Data":"fedd6a42832e8c97d8f36e2b50900bd256a190b8a33aa8fcf464d85c4de830ae"} Oct 03 08:05:39 crc kubenswrapper[4664]: I1003 08:05:39.150655 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz" Oct 03 08:05:39 crc kubenswrapper[4664]: I1003 08:05:39.152596 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9" event={"ID":"738a79fc-a362-4c7c-a101-f8551019a96b","Type":"ContainerStarted","Data":"24f342dff788e1c0451eb1c47965459bd8e2a9edd5f2207159a32729bb8da7fb"} Oct 03 08:05:39 crc kubenswrapper[4664]: I1003 08:05:39.152902 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9" Oct 03 08:05:39 crc 
kubenswrapper[4664]: I1003 08:05:39.158643 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp" podStartSLOduration=3.133235394 podStartE2EDuration="24.158625766s" podCreationTimestamp="2025-10-03 08:05:15 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.518807744 +0000 UTC m=+1018.339998234" lastFinishedPulling="2025-10-03 08:05:38.544198106 +0000 UTC m=+1039.365388606" observedRunningTime="2025-10-03 08:05:39.155375502 +0000 UTC m=+1039.976565992" watchObservedRunningTime="2025-10-03 08:05:39.158625766 +0000 UTC m=+1039.979816256" Oct 03 08:05:39 crc kubenswrapper[4664]: I1003 08:05:39.208745 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz" podStartSLOduration=3.330066798 podStartE2EDuration="24.208725253s" podCreationTimestamp="2025-10-03 08:05:15 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.726264425 +0000 UTC m=+1018.547454915" lastFinishedPulling="2025-10-03 08:05:38.60492288 +0000 UTC m=+1039.426113370" observedRunningTime="2025-10-03 08:05:39.185713358 +0000 UTC m=+1040.006903868" watchObservedRunningTime="2025-10-03 08:05:39.208725253 +0000 UTC m=+1040.029915753" Oct 03 08:05:39 crc kubenswrapper[4664]: I1003 08:05:39.209733 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p" podStartSLOduration=4.339951418 podStartE2EDuration="25.209705931s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.673670661 +0000 UTC m=+1018.494861151" lastFinishedPulling="2025-10-03 08:05:38.543425174 +0000 UTC m=+1039.364615664" observedRunningTime="2025-10-03 08:05:39.203174343 +0000 UTC m=+1040.024364843" watchObservedRunningTime="2025-10-03 08:05:39.209705931 +0000 UTC m=+1040.030896441" Oct 03 08:05:39 crc kubenswrapper[4664]: I1003 08:05:39.230872 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9" podStartSLOduration=4.296200326 podStartE2EDuration="25.230850232s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.608804939 +0000 UTC m=+1018.429995429" lastFinishedPulling="2025-10-03 08:05:38.543454845 +0000 UTC m=+1039.364645335" observedRunningTime="2025-10-03 08:05:39.224084467 +0000 UTC m=+1040.045274967" watchObservedRunningTime="2025-10-03 08:05:39.230850232 +0000 UTC m=+1040.052040732" Oct 03 08:05:39 crc kubenswrapper[4664]: I1003 08:05:39.251296 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px" podStartSLOduration=3.340969778 podStartE2EDuration="24.251276042s" podCreationTimestamp="2025-10-03 08:05:15 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.673747253 +0000 UTC m=+1018.494937743" lastFinishedPulling="2025-10-03 08:05:38.584053507 +0000 UTC m=+1039.405244007" observedRunningTime="2025-10-03 08:05:39.246730181 +0000 UTC m=+1040.067920691" watchObservedRunningTime="2025-10-03 08:05:39.251276042 +0000 UTC m=+1040.072466532" Oct 03 08:05:40 crc kubenswrapper[4664]: I1003 08:05:40.161932 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d" 
event={"ID":"6492ba28-5e2e-42b3-829e-5b666703bd85","Type":"ContainerStarted","Data":"83d4f57a7a98a467ef3dd108bb64262b371ee9dfc51dcba410c13812059666a7"} Oct 03 08:05:40 crc kubenswrapper[4664]: I1003 08:05:40.163754 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d" Oct 03 08:05:40 crc kubenswrapper[4664]: I1003 08:05:40.182064 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d" podStartSLOduration=4.235532745 podStartE2EDuration="26.1820443s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.719864589 +0000 UTC m=+1018.541055089" lastFinishedPulling="2025-10-03 08:05:39.666376164 +0000 UTC m=+1040.487566644" observedRunningTime="2025-10-03 08:05:40.177225571 +0000 UTC m=+1040.998416091" watchObservedRunningTime="2025-10-03 08:05:40.1820443 +0000 UTC m=+1041.003234790" Oct 03 08:05:41 crc kubenswrapper[4664]: I1003 08:05:41.987388 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:05:41 crc kubenswrapper[4664]: I1003 08:05:41.987879 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:05:44 crc kubenswrapper[4664]: I1003 08:05:44.660438 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-mhzlb" Oct 03 08:05:44 crc kubenswrapper[4664]: I1003 08:05:44.686351 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-nw8d7" Oct 03 08:05:44 crc kubenswrapper[4664]: I1003 08:05:44.710424 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-bvgv4" Oct 03 08:05:44 crc kubenswrapper[4664]: I1003 08:05:44.725399 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-44trb" Oct 03 08:05:44 crc kubenswrapper[4664]: I1003 08:05:44.751724 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-599898f689-g5h6q" Oct 03 08:05:44 crc kubenswrapper[4664]: I1003 08:05:44.817799 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-qlgfl" Oct 03 08:05:44 crc kubenswrapper[4664]: I1003 08:05:44.918411 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-fhpcz" Oct 03 08:05:44 crc kubenswrapper[4664]: I1003 08:05:44.954356 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-qf7qg" Oct 03 08:05:45 crc kubenswrapper[4664]: I1003 
08:05:45.140875 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-8nvt9" Oct 03 08:05:45 crc kubenswrapper[4664]: I1003 08:05:45.277356 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-59qtz" Oct 03 08:05:45 crc kubenswrapper[4664]: I1003 08:05:45.426478 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-ct22p" Oct 03 08:05:45 crc kubenswrapper[4664]: I1003 08:05:45.606565 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-fm85d" Oct 03 08:05:45 crc kubenswrapper[4664]: I1003 08:05:45.791806 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-t48px" Oct 03 08:05:45 crc kubenswrapper[4664]: I1003 08:05:45.821437 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-kcqct" Oct 03 08:05:45 crc kubenswrapper[4664]: I1003 08:05:45.825134 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8wwpz" Oct 03 08:05:47 crc kubenswrapper[4664]: I1003 08:05:47.235739 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq" event={"ID":"a931be65-b5d0-4685-82ea-42102c8235c8","Type":"ContainerStarted","Data":"a13c5acae279840c045ee20984eacbcf2ede2b8b85876c8be26a6a40bfbdc72d"} Oct 03 08:05:47 crc kubenswrapper[4664]: I1003 08:05:47.236914 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq" Oct 03 08:05:47 crc kubenswrapper[4664]: I1003 08:05:47.280383 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq" podStartSLOduration=4.235616619 podStartE2EDuration="33.280346413s" podCreationTimestamp="2025-10-03 08:05:14 +0000 UTC" firstStartedPulling="2025-10-03 08:05:17.334080335 +0000 UTC m=+1018.155270825" lastFinishedPulling="2025-10-03 08:05:46.378810129 +0000 UTC m=+1047.200000619" observedRunningTime="2025-10-03 08:05:47.270739265 +0000 UTC m=+1048.091929765" watchObservedRunningTime="2025-10-03 08:05:47.280346413 +0000 UTC m=+1048.101536903" Oct 03 08:05:55 crc kubenswrapper[4664]: I1003 08:05:55.288336 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-8jdvq" Oct 03 08:06:11 crc kubenswrapper[4664]: I1003 08:06:11.988037 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:06:11 crc kubenswrapper[4664]: I1003 08:06:11.988633 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:06:11 crc kubenswrapper[4664]: I1003 08:06:11.988701 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 08:06:11 crc kubenswrapper[4664]: I1003 08:06:11.989329 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d72d28e356e7ba889e503f6d77ef4dcc3b64c797b9e1df46488fe0f1d0abb973"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:06:11 crc kubenswrapper[4664]: I1003 08:06:11.989379 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://d72d28e356e7ba889e503f6d77ef4dcc3b64c797b9e1df46488fe0f1d0abb973" gracePeriod=600 Oct 03 08:06:12 crc kubenswrapper[4664]: I1003 08:06:12.466363 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="d72d28e356e7ba889e503f6d77ef4dcc3b64c797b9e1df46488fe0f1d0abb973" exitCode=0 Oct 03 08:06:12 crc kubenswrapper[4664]: I1003 08:06:12.466727 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"d72d28e356e7ba889e503f6d77ef4dcc3b64c797b9e1df46488fe0f1d0abb973"} Oct 03 08:06:12 crc kubenswrapper[4664]: I1003 08:06:12.467048 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"06473cda750028c12efef390356377e8ae805e2359da1c4b578e9e258218058e"} Oct 03 08:06:12 crc kubenswrapper[4664]: I1003 08:06:12.467072 4664 scope.go:117] "RemoveContainer" containerID="33a410bbdb246cf2e9dcb8e9de77a40e30f71ec5cde831e8cfca46d88165b8b1" Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.368210 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zrgdw"] Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.369832 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zrgdw"
Oct 03 08:06:13 crc kubenswrapper[4664]: W1003 08:06:13.378478 4664 reflector.go:561] object-"openstack"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Oct 03 08:06:13 crc kubenswrapper[4664]: W1003 08:06:13.378938 4664 reflector.go:561] object-"openstack"/"dnsmasq-dns-dockercfg-wj5z4": failed to list *v1.Secret: secrets "dnsmasq-dns-dockercfg-wj5z4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Oct 03 08:06:13 crc kubenswrapper[4664]: E1003 08:06:13.378988 4664 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"dnsmasq-dns-dockercfg-wj5z4\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"dnsmasq-dns-dockercfg-wj5z4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 03 08:06:13 crc kubenswrapper[4664]: W1003 08:06:13.378806 4664 reflector.go:561] object-"openstack"/"dns": failed to list *v1.ConfigMap: configmaps "dns" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Oct 03 08:06:13 crc kubenswrapper[4664]: E1003 08:06:13.379022 4664 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"dns\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"dns\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 03 08:06:13 crc kubenswrapper[4664]: W1003 08:06:13.378888 4664 reflector.go:561] object-"openstack"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Oct 03 08:06:13 crc kubenswrapper[4664]: E1003 08:06:13.379043 4664 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 03 08:06:13 crc kubenswrapper[4664]: E1003 08:06:13.378937 4664 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.386711 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zrgdw"]
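The "no relationship found between node 'crc' and this object" rejections come from the node authorizer: a kubelet may only read Secrets and ConfigMaps referenced by pods already bound to it, so these list/watch calls fail until the API server observes the new dnsmasq pod's binding, after which the "Caches populated" lines below appear. A client-go sketch of the same single-object LIST the reflector issues (the kubeconfig path is an assumption for illustration):

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: the kubelet's own credentials, e.g. /var/lib/kubelet/kubeconfig.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)
	// Reflectors list a single object by name, as in the entries above.
	_, err = client.CoreV1().ConfigMaps("openstack").List(context.TODO(),
		metav1.ListOptions{FieldSelector: "metadata.name=openshift-service-ca.crt"})
	fmt.Println(err) // forbidden until a pod bound to this node references it
}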
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9dx47"] Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.427917 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.431334 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9dx47"] Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.432430 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d660af-8326-4df1-a0e7-2f8804fea23d-config\") pod \"dnsmasq-dns-675f4bcbfc-zrgdw\" (UID: \"46d660af-8326-4df1-a0e7-2f8804fea23d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zrgdw" Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.432634 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkn9h\" (UniqueName: \"kubernetes.io/projected/46d660af-8326-4df1-a0e7-2f8804fea23d-kube-api-access-hkn9h\") pod \"dnsmasq-dns-675f4bcbfc-zrgdw\" (UID: \"46d660af-8326-4df1-a0e7-2f8804fea23d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zrgdw" Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.433065 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.534313 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d660af-8326-4df1-a0e7-2f8804fea23d-config\") pod \"dnsmasq-dns-675f4bcbfc-zrgdw\" (UID: \"46d660af-8326-4df1-a0e7-2f8804fea23d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zrgdw" Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.534380 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85abc773-d346-480c-9255-6eef5f2c4d29-config\") pod \"dnsmasq-dns-78dd6ddcc-9dx47\" (UID: \"85abc773-d346-480c-9255-6eef5f2c4d29\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.534447 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkn9h\" (UniqueName: \"kubernetes.io/projected/46d660af-8326-4df1-a0e7-2f8804fea23d-kube-api-access-hkn9h\") pod \"dnsmasq-dns-675f4bcbfc-zrgdw\" (UID: \"46d660af-8326-4df1-a0e7-2f8804fea23d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zrgdw" Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.534489 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbjdg\" (UniqueName: \"kubernetes.io/projected/85abc773-d346-480c-9255-6eef5f2c4d29-kube-api-access-wbjdg\") pod \"dnsmasq-dns-78dd6ddcc-9dx47\" (UID: \"85abc773-d346-480c-9255-6eef5f2c4d29\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.534547 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85abc773-d346-480c-9255-6eef5f2c4d29-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9dx47\" (UID: \"85abc773-d346-480c-9255-6eef5f2c4d29\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.635700 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbjdg\" 
(UniqueName: \"kubernetes.io/projected/85abc773-d346-480c-9255-6eef5f2c4d29-kube-api-access-wbjdg\") pod \"dnsmasq-dns-78dd6ddcc-9dx47\" (UID: \"85abc773-d346-480c-9255-6eef5f2c4d29\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.636017 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85abc773-d346-480c-9255-6eef5f2c4d29-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9dx47\" (UID: \"85abc773-d346-480c-9255-6eef5f2c4d29\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.636118 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85abc773-d346-480c-9255-6eef5f2c4d29-config\") pod \"dnsmasq-dns-78dd6ddcc-9dx47\" (UID: \"85abc773-d346-480c-9255-6eef5f2c4d29\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" Oct 03 08:06:13 crc kubenswrapper[4664]: I1003 08:06:13.637155 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85abc773-d346-480c-9255-6eef5f2c4d29-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9dx47\" (UID: \"85abc773-d346-480c-9255-6eef5f2c4d29\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" Oct 03 08:06:14 crc kubenswrapper[4664]: I1003 08:06:14.254885 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-wj5z4" Oct 03 08:06:14 crc kubenswrapper[4664]: I1003 08:06:14.420931 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 03 08:06:14 crc kubenswrapper[4664]: I1003 08:06:14.425906 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d660af-8326-4df1-a0e7-2f8804fea23d-config\") pod \"dnsmasq-dns-675f4bcbfc-zrgdw\" (UID: \"46d660af-8326-4df1-a0e7-2f8804fea23d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zrgdw" Oct 03 08:06:14 crc kubenswrapper[4664]: I1003 08:06:14.427917 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85abc773-d346-480c-9255-6eef5f2c4d29-config\") pod \"dnsmasq-dns-78dd6ddcc-9dx47\" (UID: \"85abc773-d346-480c-9255-6eef5f2c4d29\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" Oct 03 08:06:14 crc kubenswrapper[4664]: I1003 08:06:14.441530 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 03 08:06:14 crc kubenswrapper[4664]: E1003 08:06:14.547882 4664 projected.go:288] Couldn't get configMap openstack/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 03 08:06:14 crc kubenswrapper[4664]: E1003 08:06:14.547965 4664 projected.go:194] Error preparing data for projected volume kube-api-access-hkn9h for pod openstack/dnsmasq-dns-675f4bcbfc-zrgdw: failed to sync configmap cache: timed out waiting for the condition Oct 03 08:06:14 crc kubenswrapper[4664]: E1003 08:06:14.548025 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/46d660af-8326-4df1-a0e7-2f8804fea23d-kube-api-access-hkn9h podName:46d660af-8326-4df1-a0e7-2f8804fea23d nodeName:}" failed. No retries permitted until 2025-10-03 08:06:15.048008776 +0000 UTC m=+1075.869199266 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hkn9h" (UniqueName: "kubernetes.io/projected/46d660af-8326-4df1-a0e7-2f8804fea23d-kube-api-access-hkn9h") pod "dnsmasq-dns-675f4bcbfc-zrgdw" (UID: "46d660af-8326-4df1-a0e7-2f8804fea23d") : failed to sync configmap cache: timed out waiting for the condition Oct 03 08:06:14 crc kubenswrapper[4664]: E1003 08:06:14.653823 4664 projected.go:288] Couldn't get configMap openstack/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 03 08:06:14 crc kubenswrapper[4664]: E1003 08:06:14.653873 4664 projected.go:194] Error preparing data for projected volume kube-api-access-wbjdg for pod openstack/dnsmasq-dns-78dd6ddcc-9dx47: failed to sync configmap cache: timed out waiting for the condition Oct 03 08:06:14 crc kubenswrapper[4664]: E1003 08:06:14.653938 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85abc773-d346-480c-9255-6eef5f2c4d29-kube-api-access-wbjdg podName:85abc773-d346-480c-9255-6eef5f2c4d29 nodeName:}" failed. No retries permitted until 2025-10-03 08:06:15.153917146 +0000 UTC m=+1075.975107636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wbjdg" (UniqueName: "kubernetes.io/projected/85abc773-d346-480c-9255-6eef5f2c4d29-kube-api-access-wbjdg") pod "dnsmasq-dns-78dd6ddcc-9dx47" (UID: "85abc773-d346-480c-9255-6eef5f2c4d29") : failed to sync configmap cache: timed out waiting for the condition Oct 03 08:06:14 crc kubenswrapper[4664]: I1003 08:06:14.923010 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 03 08:06:15 crc kubenswrapper[4664]: I1003 08:06:15.090073 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkn9h\" (UniqueName: \"kubernetes.io/projected/46d660af-8326-4df1-a0e7-2f8804fea23d-kube-api-access-hkn9h\") pod \"dnsmasq-dns-675f4bcbfc-zrgdw\" (UID: \"46d660af-8326-4df1-a0e7-2f8804fea23d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zrgdw" Oct 03 08:06:15 crc kubenswrapper[4664]: I1003 08:06:15.098397 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkn9h\" (UniqueName: \"kubernetes.io/projected/46d660af-8326-4df1-a0e7-2f8804fea23d-kube-api-access-hkn9h\") pod \"dnsmasq-dns-675f4bcbfc-zrgdw\" (UID: \"46d660af-8326-4df1-a0e7-2f8804fea23d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zrgdw" Oct 03 08:06:15 crc kubenswrapper[4664]: I1003 08:06:15.192100 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbjdg\" (UniqueName: \"kubernetes.io/projected/85abc773-d346-480c-9255-6eef5f2c4d29-kube-api-access-wbjdg\") pod \"dnsmasq-dns-78dd6ddcc-9dx47\" (UID: \"85abc773-d346-480c-9255-6eef5f2c4d29\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" Oct 03 08:06:15 crc kubenswrapper[4664]: I1003 08:06:15.192597 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zrgdw" Oct 03 08:06:15 crc kubenswrapper[4664]: I1003 08:06:15.200469 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbjdg\" (UniqueName: \"kubernetes.io/projected/85abc773-d346-480c-9255-6eef5f2c4d29-kube-api-access-wbjdg\") pod \"dnsmasq-dns-78dd6ddcc-9dx47\" (UID: \"85abc773-d346-480c-9255-6eef5f2c4d29\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" Oct 03 08:06:15 crc kubenswrapper[4664]: I1003 08:06:15.251698 4664 util.go:30] "No sandbox for pod can be found. 
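The nestedpendingoperations entries above show why the projected token volumes still mount successfully about half a second later: a failed MountVolume.SetUp is not retried in place; the operation is re-queued with a delay (durationBeforeRetry 500ms here, growing on repeated failures), and the retry succeeds once the kube-root-ca.crt cache has synced. A Go sketch of that pattern (the doubling factor and the five-attempt budget are assumptions, not values from the log):

package main

import (
	"errors"
	"fmt"
	"time"
)

func mountWithBackoff(mount func() error) error {
	delay := 500 * time.Millisecond // durationBeforeRetry in the entries above
	for attempt := 1; attempt <= 5; attempt++ {
		err := mount()
		if err == nil {
			return nil
		}
		fmt.Printf("no retries permitted until %s (durationBeforeRetry %s): %v\n",
			time.Now().Add(delay).Format(time.RFC3339Nano), delay, err)
		time.Sleep(delay)
		delay *= 2 // assumed exponential growth; the real controller caps this
	}
	return errors.New("mount did not succeed within the attempt budget")
}

func main() {
	tries := 0
	_ = mountWithBackoff(func() error {
		tries++
		if tries < 3 {
			return errors.New("failed to sync configmap cache: timed out waiting for the condition")
		}
		fmt.Println("MountVolume.SetUp succeeded on attempt", tries)
		return nil
	})
}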
Oct 03 08:06:15 crc kubenswrapper[4664]: I1003 08:06:15.251698 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47"
Oct 03 08:06:15 crc kubenswrapper[4664]: I1003 08:06:15.687794 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zrgdw"]
Oct 03 08:06:15 crc kubenswrapper[4664]: I1003 08:06:15.749458 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9dx47"]
Oct 03 08:06:15 crc kubenswrapper[4664]: W1003 08:06:15.751139 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85abc773_d346_480c_9255_6eef5f2c4d29.slice/crio-a84bc645cac9c81cc00c6071c96710a1585cc7a3b9440b1cc92e808ae4e37333 WatchSource:0}: Error finding container a84bc645cac9c81cc00c6071c96710a1585cc7a3b9440b1cc92e808ae4e37333: Status 404 returned error can't find the container with id a84bc645cac9c81cc00c6071c96710a1585cc7a3b9440b1cc92e808ae4e37333
Oct 03 08:06:15 crc kubenswrapper[4664]: I1003 08:06:15.949977 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zrgdw"]
Oct 03 08:06:15 crc kubenswrapper[4664]: I1003 08:06:15.987070 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pvnb8"]
Oct 03 08:06:15 crc kubenswrapper[4664]: I1003 08:06:15.988714 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pvnb8"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.004295 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pvnb8"]
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.104544 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/671473b6-b445-4c88-8a2e-1c20a23c5b4f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pvnb8\" (UID: \"671473b6-b445-4c88-8a2e-1c20a23c5b4f\") " pod="openstack/dnsmasq-dns-666b6646f7-pvnb8"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.104632 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/671473b6-b445-4c88-8a2e-1c20a23c5b4f-config\") pod \"dnsmasq-dns-666b6646f7-pvnb8\" (UID: \"671473b6-b445-4c88-8a2e-1c20a23c5b4f\") " pod="openstack/dnsmasq-dns-666b6646f7-pvnb8"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.104689 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqsl9\" (UniqueName: \"kubernetes.io/projected/671473b6-b445-4c88-8a2e-1c20a23c5b4f-kube-api-access-mqsl9\") pod \"dnsmasq-dns-666b6646f7-pvnb8\" (UID: \"671473b6-b445-4c88-8a2e-1c20a23c5b4f\") " pod="openstack/dnsmasq-dns-666b6646f7-pvnb8"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.206539 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/671473b6-b445-4c88-8a2e-1c20a23c5b4f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pvnb8\" (UID: \"671473b6-b445-4c88-8a2e-1c20a23c5b4f\") " pod="openstack/dnsmasq-dns-666b6646f7-pvnb8"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.206912 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/671473b6-b445-4c88-8a2e-1c20a23c5b4f-config\") pod \"dnsmasq-dns-666b6646f7-pvnb8\" (UID: \"671473b6-b445-4c88-8a2e-1c20a23c5b4f\") " pod="openstack/dnsmasq-dns-666b6646f7-pvnb8"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.206970 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqsl9\" (UniqueName: \"kubernetes.io/projected/671473b6-b445-4c88-8a2e-1c20a23c5b4f-kube-api-access-mqsl9\") pod \"dnsmasq-dns-666b6646f7-pvnb8\" (UID: \"671473b6-b445-4c88-8a2e-1c20a23c5b4f\") " pod="openstack/dnsmasq-dns-666b6646f7-pvnb8"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.207750 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/671473b6-b445-4c88-8a2e-1c20a23c5b4f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pvnb8\" (UID: \"671473b6-b445-4c88-8a2e-1c20a23c5b4f\") " pod="openstack/dnsmasq-dns-666b6646f7-pvnb8"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.207842 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/671473b6-b445-4c88-8a2e-1c20a23c5b4f-config\") pod \"dnsmasq-dns-666b6646f7-pvnb8\" (UID: \"671473b6-b445-4c88-8a2e-1c20a23c5b4f\") " pod="openstack/dnsmasq-dns-666b6646f7-pvnb8"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.250342 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqsl9\" (UniqueName: \"kubernetes.io/projected/671473b6-b445-4c88-8a2e-1c20a23c5b4f-kube-api-access-mqsl9\") pod \"dnsmasq-dns-666b6646f7-pvnb8\" (UID: \"671473b6-b445-4c88-8a2e-1c20a23c5b4f\") " pod="openstack/dnsmasq-dns-666b6646f7-pvnb8"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.296305 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9dx47"]
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.319330 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hwlpb"]
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.320566 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.334160 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hwlpb"]
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.368133 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pvnb8"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.408875 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hwlpb\" (UID: \"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1\") " pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.408941 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6cg5\" (UniqueName: \"kubernetes.io/projected/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-kube-api-access-r6cg5\") pod \"dnsmasq-dns-57d769cc4f-hwlpb\" (UID: \"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1\") " pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.408982 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-config\") pod \"dnsmasq-dns-57d769cc4f-hwlpb\" (UID: \"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1\") " pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.504768 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" event={"ID":"85abc773-d346-480c-9255-6eef5f2c4d29","Type":"ContainerStarted","Data":"a84bc645cac9c81cc00c6071c96710a1585cc7a3b9440b1cc92e808ae4e37333"}
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.509072 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-zrgdw" event={"ID":"46d660af-8326-4df1-a0e7-2f8804fea23d","Type":"ContainerStarted","Data":"8575d4fd5f809af785d15b40199e333660caffc411ee9e1b01d975e207468004"}
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.517133 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hwlpb\" (UID: \"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1\") " pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.517202 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6cg5\" (UniqueName: \"kubernetes.io/projected/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-kube-api-access-r6cg5\") pod \"dnsmasq-dns-57d769cc4f-hwlpb\" (UID: \"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1\") " pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.517234 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-config\") pod \"dnsmasq-dns-57d769cc4f-hwlpb\" (UID: \"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1\") " pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.518395 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hwlpb\" (UID: \"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1\") " pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.522930 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-config\") pod \"dnsmasq-dns-57d769cc4f-hwlpb\" (UID: \"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1\") " pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.550591 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6cg5\" (UniqueName: \"kubernetes.io/projected/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-kube-api-access-r6cg5\") pod \"dnsmasq-dns-57d769cc4f-hwlpb\" (UID: \"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1\") " pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.639201 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb"
Oct 03 08:06:16 crc kubenswrapper[4664]: I1003 08:06:16.710396 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pvnb8"]
Oct 03 08:06:16 crc kubenswrapper[4664]: W1003 08:06:16.718132 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod671473b6_b445_4c88_8a2e_1c20a23c5b4f.slice/crio-ce1bd26e1916f81b444019a0a37e46b6bf0405f80acc73ce7b12a43044f036bb WatchSource:0}: Error finding container ce1bd26e1916f81b444019a0a37e46b6bf0405f80acc73ce7b12a43044f036bb: Status 404 returned error can't find the container with id ce1bd26e1916f81b444019a0a37e46b6bf0405f80acc73ce7b12a43044f036bb
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.138805 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.140170 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.148503 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.148757 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.148918 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.149143 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dndjq"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.149555 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.149721 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.149872 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.157367 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.228055 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.228128 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdgwf\" (UniqueName: \"kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-kube-api-access-wdgwf\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.228164 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.228184 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8bb7b62d-f030-45a7-b9f8-87852ea275de-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.228201 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-config-data\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.228227 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8bb7b62d-f030-45a7-b9f8-87852ea275de-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.228243 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.228267 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.228288 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.228308 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.228329 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.229443 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hwlpb"]
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.329668 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8bb7b62d-f030-45a7-b9f8-87852ea275de-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.329727 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-config-data\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.329762 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8bb7b62d-f030-45a7-b9f8-87852ea275de-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.329780 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.329802 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.329829 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.329848 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.329872 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.329916 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.329964 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdgwf\" (UniqueName: \"kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-kube-api-access-wdgwf\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.329991 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.330949 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.333824 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.334139 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.335891 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.337072 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.337735 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-config-data\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.347658 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.357898 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8bb7b62d-f030-45a7-b9f8-87852ea275de-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.367720 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.378113 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.379266 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8bb7b62d-f030-45a7-b9f8-87852ea275de-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.382582 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdgwf\" (UniqueName: \"kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-kube-api-access-wdgwf\") pod \"rabbitmq-server-0\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.468967 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.470208 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.472917 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.477470 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.477538 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-fv2fr"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.477548 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.477717 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.477823 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.477823 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.483577 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.518865 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb" event={"ID":"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1","Type":"ContainerStarted","Data":"813c6da658859d0c7f691cc971c55738fa9cc259173aa8fd5752bc913c2f38dd"}
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.519271 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.520198 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pvnb8" event={"ID":"671473b6-b445-4c88-8a2e-1c20a23c5b4f","Type":"ContainerStarted","Data":"ce1bd26e1916f81b444019a0a37e46b6bf0405f80acc73ce7b12a43044f036bb"}
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.534427 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.534467 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.534489 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.534529 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.534561 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.534586 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8ae1def-1d1a-4acd-af78-204219a99fe6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.534625 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.534642 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8ae1def-1d1a-4acd-af78-204219a99fe6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03
08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.534663 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4rm9\" (UniqueName: \"kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-kube-api-access-g4rm9\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.534691 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.534713 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.635577 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.635659 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.635686 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8ae1def-1d1a-4acd-af78-204219a99fe6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.635715 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.635731 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8ae1def-1d1a-4acd-af78-204219a99fe6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.635747 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4rm9\" (UniqueName: \"kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-kube-api-access-g4rm9\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.635770 4664 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.635785 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.635810 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.635828 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.635845 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.636127 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.639276 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.641301 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.641817 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.641857 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.642600 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.642907 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.645180 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8ae1def-1d1a-4acd-af78-204219a99fe6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.658239 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4rm9\" (UniqueName: \"kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-kube-api-access-g4rm9\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.660559 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8ae1def-1d1a-4acd-af78-204219a99fe6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.661325 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.662926 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:17 crc kubenswrapper[4664]: I1003 08:06:17.799726 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.127555 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.471983 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.529156 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8ae1def-1d1a-4acd-af78-204219a99fe6","Type":"ContainerStarted","Data":"89fafbe8bc674ee21df5e8484a48004dd14c2fdd355ca43e1401ccde72f7e7bd"} Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.531875 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8bb7b62d-f030-45a7-b9f8-87852ea275de","Type":"ContainerStarted","Data":"56fe3d12375be207cde5b8e108243abbb55d19cce00dcf9b4720c4d39ddd2f81"} Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.880019 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.881426 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.889553 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-q68st" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.889655 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.890095 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.890516 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.890618 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.898524 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.901854 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.975619 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-config-data-default\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.975689 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.975719 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.975746 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqxfq\" (UniqueName: \"kubernetes.io/projected/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-kube-api-access-vqxfq\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.975764 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.975919 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.976032 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.976301 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:18 crc kubenswrapper[4664]: I1003 08:06:18.976432 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-secrets\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.078090 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.078178 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-secrets\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.078244 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-config-data-default\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 
08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.078278 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.078330 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-kolla-config\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.078355 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqxfq\" (UniqueName: \"kubernetes.io/projected/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-kube-api-access-vqxfq\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.078396 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.078426 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.078490 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.081110 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.081511 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.087168 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-secrets\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.087250 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-combined-ca-bundle\") pod 
\"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.087250 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-kolla-config\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.087545 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.088963 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-config-data-default\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.093292 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.111588 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.111767 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqxfq\" (UniqueName: \"kubernetes.io/projected/a5b770eb-2222-43c8-bb15-6e2d18e95fbf-kube-api-access-vqxfq\") pod \"openstack-galera-0\" (UID: \"a5b770eb-2222-43c8-bb15-6e2d18e95fbf\") " pod="openstack/openstack-galera-0" Oct 03 08:06:19 crc kubenswrapper[4664]: I1003 08:06:19.229702 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.026704 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 08:06:20 crc kubenswrapper[4664]: W1003 08:06:20.042539 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5b770eb_2222_43c8_bb15_6e2d18e95fbf.slice/crio-5e01f85221e36024bde6bfed7ef44686e52d0a7d45fd8c883a46e11cac9072aa WatchSource:0}: Error finding container 5e01f85221e36024bde6bfed7ef44686e52d0a7d45fd8c883a46e11cac9072aa: Status 404 returned error can't find the container with id 5e01f85221e36024bde6bfed7ef44686e52d0a7d45fd8c883a46e11cac9072aa Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.337534 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.340196 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.348464 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.350456 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.350656 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-xvkr7" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.350855 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.361160 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.484798 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.486086 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.488964 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.489018 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.489307 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-b4kwh" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.496799 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.525631 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.525689 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfthl\" (UniqueName: \"kubernetes.io/projected/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-kube-api-access-hfthl\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.525715 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.525741 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 
08:06:20.525762 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.525778 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.525799 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.525817 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.525843 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.579956 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a5b770eb-2222-43c8-bb15-6e2d18e95fbf","Type":"ContainerStarted","Data":"5e01f85221e36024bde6bfed7ef44686e52d0a7d45fd8c883a46e11cac9072aa"} Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.627900 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngtwl\" (UniqueName: \"kubernetes.io/projected/350764d9-9981-4b2d-b69d-42712338bdd1-kube-api-access-ngtwl\") pod \"memcached-0\" (UID: \"350764d9-9981-4b2d-b69d-42712338bdd1\") " pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.633570 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.633754 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfthl\" (UniqueName: \"kubernetes.io/projected/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-kube-api-access-hfthl\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.633826 4664 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/350764d9-9981-4b2d-b69d-42712338bdd1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"350764d9-9981-4b2d-b69d-42712338bdd1\") " pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.633845 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.633927 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.634016 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.636160 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/350764d9-9981-4b2d-b69d-42712338bdd1-kolla-config\") pod \"memcached-0\" (UID: \"350764d9-9981-4b2d-b69d-42712338bdd1\") " pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.636216 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.637000 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.637467 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.638029 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.639194 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/350764d9-9981-4b2d-b69d-42712338bdd1-config-data\") pod \"memcached-0\" (UID: \"350764d9-9981-4b2d-b69d-42712338bdd1\") " pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.639250 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.639422 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350764d9-9981-4b2d-b69d-42712338bdd1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"350764d9-9981-4b2d-b69d-42712338bdd1\") " pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.712353 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.719193 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.721045 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.721278 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.721555 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.721730 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.722394 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc 
kubenswrapper[4664]: I1003 08:06:20.723306 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfthl\" (UniqueName: \"kubernetes.io/projected/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-kube-api-access-hfthl\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.723631 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0710a5a3-3e65-42b8-bd1d-d40deb6a325d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0710a5a3-3e65-42b8-bd1d-d40deb6a325d\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.740435 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/350764d9-9981-4b2d-b69d-42712338bdd1-config-data\") pod \"memcached-0\" (UID: \"350764d9-9981-4b2d-b69d-42712338bdd1\") " pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.740509 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350764d9-9981-4b2d-b69d-42712338bdd1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"350764d9-9981-4b2d-b69d-42712338bdd1\") " pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.740542 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngtwl\" (UniqueName: \"kubernetes.io/projected/350764d9-9981-4b2d-b69d-42712338bdd1-kube-api-access-ngtwl\") pod \"memcached-0\" (UID: \"350764d9-9981-4b2d-b69d-42712338bdd1\") " pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.740591 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/350764d9-9981-4b2d-b69d-42712338bdd1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"350764d9-9981-4b2d-b69d-42712338bdd1\") " pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.740637 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/350764d9-9981-4b2d-b69d-42712338bdd1-kolla-config\") pod \"memcached-0\" (UID: \"350764d9-9981-4b2d-b69d-42712338bdd1\") " pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.741904 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/350764d9-9981-4b2d-b69d-42712338bdd1-config-data\") pod \"memcached-0\" (UID: \"350764d9-9981-4b2d-b69d-42712338bdd1\") " pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.742682 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/350764d9-9981-4b2d-b69d-42712338bdd1-kolla-config\") pod \"memcached-0\" (UID: \"350764d9-9981-4b2d-b69d-42712338bdd1\") " pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.747271 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350764d9-9981-4b2d-b69d-42712338bdd1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"350764d9-9981-4b2d-b69d-42712338bdd1\") " 
pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.749388 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/350764d9-9981-4b2d-b69d-42712338bdd1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"350764d9-9981-4b2d-b69d-42712338bdd1\") " pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.764913 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngtwl\" (UniqueName: \"kubernetes.io/projected/350764d9-9981-4b2d-b69d-42712338bdd1-kube-api-access-ngtwl\") pod \"memcached-0\" (UID: \"350764d9-9981-4b2d-b69d-42712338bdd1\") " pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.830553 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 08:06:20 crc kubenswrapper[4664]: I1003 08:06:20.970322 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 08:06:21 crc kubenswrapper[4664]: I1003 08:06:21.311464 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 08:06:21 crc kubenswrapper[4664]: W1003 08:06:21.327599 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod350764d9_9981_4b2d_b69d_42712338bdd1.slice/crio-3928e45f0ece08f97eaf1befe84f0cf75ffe77d49b4d2a54fe2d214a9d234807 WatchSource:0}: Error finding container 3928e45f0ece08f97eaf1befe84f0cf75ffe77d49b4d2a54fe2d214a9d234807: Status 404 returned error can't find the container with id 3928e45f0ece08f97eaf1befe84f0cf75ffe77d49b4d2a54fe2d214a9d234807 Oct 03 08:06:21 crc kubenswrapper[4664]: I1003 08:06:21.331247 4664 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:06:21 crc kubenswrapper[4664]: I1003 08:06:21.590261 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"350764d9-9981-4b2d-b69d-42712338bdd1","Type":"ContainerStarted","Data":"3928e45f0ece08f97eaf1befe84f0cf75ffe77d49b4d2a54fe2d214a9d234807"} Oct 03 08:06:21 crc kubenswrapper[4664]: I1003 08:06:21.784907 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 08:06:22 crc kubenswrapper[4664]: I1003 08:06:22.450325 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 08:06:22 crc kubenswrapper[4664]: I1003 08:06:22.451524 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 08:06:22 crc kubenswrapper[4664]: I1003 08:06:22.455509 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 08:06:22 crc kubenswrapper[4664]: I1003 08:06:22.455582 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-mv5fx" Oct 03 08:06:22 crc kubenswrapper[4664]: I1003 08:06:22.609811 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxrxn\" (UniqueName: \"kubernetes.io/projected/c1dcd075-92b3-4f17-888c-2e5580a45789-kube-api-access-hxrxn\") pod \"kube-state-metrics-0\" (UID: \"c1dcd075-92b3-4f17-888c-2e5580a45789\") " pod="openstack/kube-state-metrics-0" Oct 03 08:06:22 crc kubenswrapper[4664]: I1003 08:06:22.672126 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0710a5a3-3e65-42b8-bd1d-d40deb6a325d","Type":"ContainerStarted","Data":"d05a8185e2965848b1fb2f0ee1b18a9dbab9c8c55c29c15e617b7675b770a6db"} Oct 03 08:06:22 crc kubenswrapper[4664]: I1003 08:06:22.711918 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxrxn\" (UniqueName: \"kubernetes.io/projected/c1dcd075-92b3-4f17-888c-2e5580a45789-kube-api-access-hxrxn\") pod \"kube-state-metrics-0\" (UID: \"c1dcd075-92b3-4f17-888c-2e5580a45789\") " pod="openstack/kube-state-metrics-0" Oct 03 08:06:22 crc kubenswrapper[4664]: I1003 08:06:22.746952 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxrxn\" (UniqueName: \"kubernetes.io/projected/c1dcd075-92b3-4f17-888c-2e5580a45789-kube-api-access-hxrxn\") pod \"kube-state-metrics-0\" (UID: \"c1dcd075-92b3-4f17-888c-2e5580a45789\") " pod="openstack/kube-state-metrics-0" Oct 03 08:06:22 crc kubenswrapper[4664]: I1003 08:06:22.779225 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 08:06:23 crc kubenswrapper[4664]: I1003 08:06:23.294540 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 08:06:23 crc kubenswrapper[4664]: W1003 08:06:23.356906 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1dcd075_92b3_4f17_888c_2e5580a45789.slice/crio-4524900e84497138721aee9226368bf37246668c11d3848b5f94816bf202c2e3 WatchSource:0}: Error finding container 4524900e84497138721aee9226368bf37246668c11d3848b5f94816bf202c2e3: Status 404 returned error can't find the container with id 4524900e84497138721aee9226368bf37246668c11d3848b5f94816bf202c2e3 Oct 03 08:06:23 crc kubenswrapper[4664]: I1003 08:06:23.683478 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c1dcd075-92b3-4f17-888c-2e5580a45789","Type":"ContainerStarted","Data":"4524900e84497138721aee9226368bf37246668c11d3848b5f94816bf202c2e3"} Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.112384 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fb5ld"] Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.116293 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.119389 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.123064 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rmhd5" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.127181 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.140266 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fb5ld"] Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.171775 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-qcwp9"] Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.174083 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.196809 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-combined-ca-bundle\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.196867 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-scripts\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.196894 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-var-run-ovn\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.196955 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-ovn-controller-tls-certs\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.196982 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-var-run\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.196997 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgxbj\" (UniqueName: \"kubernetes.io/projected/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-kube-api-access-rgxbj\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.197021 4664 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-var-log-ovn\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.197785 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qcwp9"] Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.299863 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-ovn-controller-tls-certs\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.299929 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/deb5f246-e857-4517-9f9c-290bc76ba8f6-var-lib\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.299960 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-var-run\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.299981 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgxbj\" (UniqueName: \"kubernetes.io/projected/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-kube-api-access-rgxbj\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.300015 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/deb5f246-e857-4517-9f9c-290bc76ba8f6-var-log\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.300037 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/deb5f246-e857-4517-9f9c-290bc76ba8f6-var-run\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.300058 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw2hz\" (UniqueName: \"kubernetes.io/projected/deb5f246-e857-4517-9f9c-290bc76ba8f6-kube-api-access-jw2hz\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.300092 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-var-log-ovn\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc 
kubenswrapper[4664]: I1003 08:06:26.300135 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-combined-ca-bundle\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.300186 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-scripts\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.300226 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-var-run-ovn\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.300280 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/deb5f246-e857-4517-9f9c-290bc76ba8f6-scripts\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.300309 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/deb5f246-e857-4517-9f9c-290bc76ba8f6-etc-ovs\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.301008 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-var-log-ovn\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.301520 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-var-run\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.302049 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-var-run-ovn\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.306465 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-scripts\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.309877 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-ovn-controller-tls-certs\") pod 
\"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.318718 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-combined-ca-bundle\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.324181 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgxbj\" (UniqueName: \"kubernetes.io/projected/b65aa3e9-2d60-4cd9-b63a-93a07ab33e72-kube-api-access-rgxbj\") pod \"ovn-controller-fb5ld\" (UID: \"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72\") " pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.401769 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/deb5f246-e857-4517-9f9c-290bc76ba8f6-scripts\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.401869 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/deb5f246-e857-4517-9f9c-290bc76ba8f6-etc-ovs\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.401939 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/deb5f246-e857-4517-9f9c-290bc76ba8f6-var-lib\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.401973 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/deb5f246-e857-4517-9f9c-290bc76ba8f6-var-log\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.401995 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/deb5f246-e857-4517-9f9c-290bc76ba8f6-var-run\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.402013 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw2hz\" (UniqueName: \"kubernetes.io/projected/deb5f246-e857-4517-9f9c-290bc76ba8f6-kube-api-access-jw2hz\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.402140 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/deb5f246-e857-4517-9f9c-290bc76ba8f6-etc-ovs\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.402257 4664 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/deb5f246-e857-4517-9f9c-290bc76ba8f6-var-log\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.402408 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/deb5f246-e857-4517-9f9c-290bc76ba8f6-var-lib\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.402440 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/deb5f246-e857-4517-9f9c-290bc76ba8f6-var-run\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.405974 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/deb5f246-e857-4517-9f9c-290bc76ba8f6-scripts\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.438519 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw2hz\" (UniqueName: \"kubernetes.io/projected/deb5f246-e857-4517-9f9c-290bc76ba8f6-kube-api-access-jw2hz\") pod \"ovn-controller-ovs-qcwp9\" (UID: \"deb5f246-e857-4517-9f9c-290bc76ba8f6\") " pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.463706 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fb5ld" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.497374 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.499071 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.503133 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.503479 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.503560 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.503734 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-kkvvk" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.509942 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.512872 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.528007 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.605880 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.605999 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccqlq\" (UniqueName: \"kubernetes.io/projected/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-kube-api-access-ccqlq\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.606034 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-config\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.606336 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.606434 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.606554 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.606660 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.606715 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.708153 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 
08:06:26.708281 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccqlq\" (UniqueName: \"kubernetes.io/projected/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-kube-api-access-ccqlq\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.708328 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-config\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.708403 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.708449 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.708518 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.708540 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.708557 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.709297 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.710300 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-config\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.710300 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0"
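[Editor's note] The PVC-backed local-storage11-crc volume gets one extra stage that the configmap/secret/projected volumes do not: "MountVolume.MountDevice succeeded" (operation_generator.go:580) stages the device once at a node-global path (/mnt/openstack/pv11), after which the per-pod SetUp can bind-mount that staged path into the pod's volumes directory. A sketch of the two-step ordering, with invented helper names (not the real volume-plugin API):

```go
package main

import "fmt"

// stageDevice models MountVolume.MountDevice: the device is prepared once
// at a node-global mount path shared by every pod using this volume.
func stageDevice(volume, devicePath string) string {
	fmt.Printf("MountVolume.MountDevice succeeded for volume %q device mount path %q\n",
		volume, devicePath)
	return devicePath
}

// setUpForPod models the per-pod SetUp that follows: a bind mount from the
// staged global path into the pod's own volume directory under /var/lib/kubelet.
func setUpForPod(volume, global, podUID string) {
	podPath := "/var/lib/kubelet/pods/" + podUID + "/volumes/" + volume
	fmt.Printf("bind mount %s -> %s\n", global, podPath)
	fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", volume)
}

func main() {
	global := stageDevice("local-storage11-crc", "/mnt/openstack/pv11")
	setUpForPod("local-storage11-crc", global, "3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3")
}
```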
pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.711021 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.713916 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.717156 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.717204 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.725763 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccqlq\" (UniqueName: \"kubernetes.io/projected/3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3-kube-api-access-ccqlq\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.744935 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:26 crc kubenswrapper[4664]: I1003 08:06:26.831176 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.181147 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.183337 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.193910 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.194826 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.196215 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.199731 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-x6xh9" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.201016 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.371989 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.372052 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-config\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.372153 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7srv\" (UniqueName: \"kubernetes.io/projected/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-kube-api-access-w7srv\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.372223 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.372693 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.372786 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.372968 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.373116 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.474403 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7srv\" (UniqueName: \"kubernetes.io/projected/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-kube-api-access-w7srv\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.474489 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.474575 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.474593 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.474658 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.474700 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.474726 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.474758 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-config\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.475555 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-config\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.476548 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.476733 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.477209 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.481641 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.481687 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.481924 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.491704 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7srv\" (UniqueName: \"kubernetes.io/projected/dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78-kube-api-access-w7srv\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.508825 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:30 crc kubenswrapper[4664]: I1003 08:06:30.516958 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 08:06:47 crc kubenswrapper[4664]: E1003 08:06:47.143335 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Oct 03 08:06:47 crc kubenswrapper[4664]: E1003 08:06:47.144144 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n594h5fch7dh545h55bh5bh5b7h5f5h566hffh5bdh694h9chbdh68fh5bh699h597hf6h698h8dh664h9bhcchbch8ch594h688h5d9h677h5c5h5b8q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ngtwl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in 
Oct 03 08:06:47 crc kubenswrapper[4664]: E1003 08:06:47.145369 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="350764d9-9981-4b2d-b69d-42712338bdd1"
Oct 03 08:06:47 crc kubenswrapper[4664]: E1003 08:06:47.893740 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="350764d9-9981-4b2d-b69d-42712338bdd1"
Oct 03 08:06:48 crc kubenswrapper[4664]: E1003 08:06:48.698053 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified"
Oct 03 08:06:48 crc kubenswrapper[4664]: E1003 08:06:48.698262 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g4rm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(b8ae1def-1d1a-4acd-af78-204219a99fe6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:06:48 crc kubenswrapper[4664]: E1003 08:06:48.699560 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b8ae1def-1d1a-4acd-af78-204219a99fe6" Oct 03 08:06:48 crc kubenswrapper[4664]: E1003 08:06:48.900652 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b8ae1def-1d1a-4acd-af78-204219a99fe6" Oct 03 08:06:51 crc kubenswrapper[4664]: E1003 08:06:51.364855 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Oct 03 08:06:51 crc kubenswrapper[4664]: E1003 08:06:51.365345 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqxfq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(a5b770eb-2222-43c8-bb15-6e2d18e95fbf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:06:51 crc kubenswrapper[4664]: E1003 08:06:51.366532 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="a5b770eb-2222-43c8-bb15-6e2d18e95fbf" Oct 03 08:06:51 crc kubenswrapper[4664]: E1003 08:06:51.380835 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Oct 03 08:06:51 crc kubenswrapper[4664]: E1003 08:06:51.380903 4664 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Oct 03 08:06:51 crc kubenswrapper[4664]: E1003 08:06:51.381058 4664 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hxrxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(c1dcd075-92b3-4f17-888c-2e5580a45789): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" logger="UnhandledError" Oct 03 08:06:51 crc kubenswrapper[4664]: E1003 08:06:51.382390 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="c1dcd075-92b3-4f17-888c-2e5580a45789" Oct 03 08:06:51 crc kubenswrapper[4664]: E1003 08:06:51.941692 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="a5b770eb-2222-43c8-bb15-6e2d18e95fbf" Oct 03 08:06:51 crc kubenswrapper[4664]: E1003 08:06:51.941786 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="c1dcd075-92b3-4f17-888c-2e5580a45789" Oct 03 08:06:53 crc kubenswrapper[4664]: E1003 08:06:53.869932 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Oct 03 08:06:53 crc kubenswrapper[4664]: E1003 08:06:53.873038 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wdgwf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(8bb7b62d-f030-45a7-b9f8-87852ea275de): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:06:53 crc kubenswrapper[4664]: E1003 08:06:53.874270 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="8bb7b62d-f030-45a7-b9f8-87852ea275de" Oct 03 08:06:53 crc kubenswrapper[4664]: E1003 08:06:53.883827 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Oct 03 08:06:53 crc kubenswrapper[4664]: E1003 08:06:53.884064 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hfthl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(0710a5a3-3e65-42b8-bd1d-d40deb6a325d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:06:53 crc kubenswrapper[4664]: E1003 08:06:53.885661 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="0710a5a3-3e65-42b8-bd1d-d40deb6a325d" Oct 03 08:06:53 crc kubenswrapper[4664]: E1003 08:06:53.960949 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" 
podUID="0710a5a3-3e65-42b8-bd1d-d40deb6a325d" Oct 03 08:06:53 crc kubenswrapper[4664]: E1003 08:06:53.961455 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="8bb7b62d-f030-45a7-b9f8-87852ea275de" Oct 03 08:06:54 crc kubenswrapper[4664]: E1003 08:06:54.371320 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 03 08:06:54 crc kubenswrapper[4664]: E1003 08:06:54.371493 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r6cg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-hwlpb_openstack(96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:06:54 crc kubenswrapper[4664]: E1003 08:06:54.372574 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb" podUID="96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1" Oct 03 08:06:54 crc kubenswrapper[4664]: I1003 08:06:54.374206 4664 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fb5ld"] Oct 03 08:06:54 crc kubenswrapper[4664]: I1003 08:06:54.436195 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 08:06:54 crc kubenswrapper[4664]: E1003 08:06:54.495359 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 03 08:06:54 crc kubenswrapper[4664]: E1003 08:06:54.495630 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hkn9h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-zrgdw_openstack(46d660af-8326-4df1-a0e7-2f8804fea23d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:06:54 crc kubenswrapper[4664]: E1003 08:06:54.496857 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-zrgdw" podUID="46d660af-8326-4df1-a0e7-2f8804fea23d" Oct 03 08:06:54 crc kubenswrapper[4664]: I1003 08:06:54.582696 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 08:06:54 crc kubenswrapper[4664]: W1003 08:06:54.642708 4664 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd7a7515_e2bf_4c9f_9dac_1c0bd3b87e78.slice/crio-6549235e5f37ea7eaac69e5a947fa38ebf84bda9bc04fc88cc2833f5b61014df WatchSource:0}: Error finding container 6549235e5f37ea7eaac69e5a947fa38ebf84bda9bc04fc88cc2833f5b61014df: Status 404 returned error can't find the container with id 6549235e5f37ea7eaac69e5a947fa38ebf84bda9bc04fc88cc2833f5b61014df Oct 03 08:06:54 crc kubenswrapper[4664]: E1003 08:06:54.747253 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 03 08:06:54 crc kubenswrapper[4664]: E1003 08:06:54.747451 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wbjdg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-9dx47_openstack(85abc773-d346-480c-9255-6eef5f2c4d29): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:06:54 crc kubenswrapper[4664]: E1003 08:06:54.749012 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" podUID="85abc773-d346-480c-9255-6eef5f2c4d29" Oct 03 08:06:54 crc kubenswrapper[4664]: I1003 08:06:54.878146 4664 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ovn-controller-ovs-qcwp9"] Oct 03 08:06:54 crc kubenswrapper[4664]: W1003 08:06:54.883787 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeb5f246_e857_4517_9f9c_290bc76ba8f6.slice/crio-07b0c53e96bf36799b0b9a3664ec68b9f6aa0e8a63f23c200c3ecba22a38f2eb WatchSource:0}: Error finding container 07b0c53e96bf36799b0b9a3664ec68b9f6aa0e8a63f23c200c3ecba22a38f2eb: Status 404 returned error can't find the container with id 07b0c53e96bf36799b0b9a3664ec68b9f6aa0e8a63f23c200c3ecba22a38f2eb Oct 03 08:06:54 crc kubenswrapper[4664]: I1003 08:06:54.965067 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78","Type":"ContainerStarted","Data":"6549235e5f37ea7eaac69e5a947fa38ebf84bda9bc04fc88cc2833f5b61014df"} Oct 03 08:06:54 crc kubenswrapper[4664]: I1003 08:06:54.966807 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fb5ld" event={"ID":"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72","Type":"ContainerStarted","Data":"346e72c0d2fd9c94798a17f987ecbf2a87283674a961c734b4eaf0da5a97fbf7"} Oct 03 08:06:54 crc kubenswrapper[4664]: I1003 08:06:54.968092 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qcwp9" event={"ID":"deb5f246-e857-4517-9f9c-290bc76ba8f6","Type":"ContainerStarted","Data":"07b0c53e96bf36799b0b9a3664ec68b9f6aa0e8a63f23c200c3ecba22a38f2eb"} Oct 03 08:06:54 crc kubenswrapper[4664]: I1003 08:06:54.969795 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3","Type":"ContainerStarted","Data":"331313e04e54a59538f92bad21ed32afb3258a13b6402f34dce06c13261f7ca8"} Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.288188 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zrgdw" Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.297978 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.457883 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d660af-8326-4df1-a0e7-2f8804fea23d-config\") pod \"46d660af-8326-4df1-a0e7-2f8804fea23d\" (UID: \"46d660af-8326-4df1-a0e7-2f8804fea23d\") " Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.457995 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbjdg\" (UniqueName: \"kubernetes.io/projected/85abc773-d346-480c-9255-6eef5f2c4d29-kube-api-access-wbjdg\") pod \"85abc773-d346-480c-9255-6eef5f2c4d29\" (UID: \"85abc773-d346-480c-9255-6eef5f2c4d29\") " Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.458050 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85abc773-d346-480c-9255-6eef5f2c4d29-config\") pod \"85abc773-d346-480c-9255-6eef5f2c4d29\" (UID: \"85abc773-d346-480c-9255-6eef5f2c4d29\") " Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.458149 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkn9h\" (UniqueName: \"kubernetes.io/projected/46d660af-8326-4df1-a0e7-2f8804fea23d-kube-api-access-hkn9h\") pod \"46d660af-8326-4df1-a0e7-2f8804fea23d\" (UID: \"46d660af-8326-4df1-a0e7-2f8804fea23d\") " Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.458214 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85abc773-d346-480c-9255-6eef5f2c4d29-dns-svc\") pod \"85abc773-d346-480c-9255-6eef5f2c4d29\" (UID: \"85abc773-d346-480c-9255-6eef5f2c4d29\") " Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.458792 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46d660af-8326-4df1-a0e7-2f8804fea23d-config" (OuterVolumeSpecName: "config") pod "46d660af-8326-4df1-a0e7-2f8804fea23d" (UID: "46d660af-8326-4df1-a0e7-2f8804fea23d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.458812 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85abc773-d346-480c-9255-6eef5f2c4d29-config" (OuterVolumeSpecName: "config") pod "85abc773-d346-480c-9255-6eef5f2c4d29" (UID: "85abc773-d346-480c-9255-6eef5f2c4d29"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.459530 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85abc773-d346-480c-9255-6eef5f2c4d29-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "85abc773-d346-480c-9255-6eef5f2c4d29" (UID: "85abc773-d346-480c-9255-6eef5f2c4d29"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.459733 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d660af-8326-4df1-a0e7-2f8804fea23d-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.459759 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85abc773-d346-480c-9255-6eef5f2c4d29-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.463481 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d660af-8326-4df1-a0e7-2f8804fea23d-kube-api-access-hkn9h" (OuterVolumeSpecName: "kube-api-access-hkn9h") pod "46d660af-8326-4df1-a0e7-2f8804fea23d" (UID: "46d660af-8326-4df1-a0e7-2f8804fea23d"). InnerVolumeSpecName "kube-api-access-hkn9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.464077 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85abc773-d346-480c-9255-6eef5f2c4d29-kube-api-access-wbjdg" (OuterVolumeSpecName: "kube-api-access-wbjdg") pod "85abc773-d346-480c-9255-6eef5f2c4d29" (UID: "85abc773-d346-480c-9255-6eef5f2c4d29"). InnerVolumeSpecName "kube-api-access-wbjdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.562425 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbjdg\" (UniqueName: \"kubernetes.io/projected/85abc773-d346-480c-9255-6eef5f2c4d29-kube-api-access-wbjdg\") on node \"crc\" DevicePath \"\"" Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.562493 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkn9h\" (UniqueName: \"kubernetes.io/projected/46d660af-8326-4df1-a0e7-2f8804fea23d-kube-api-access-hkn9h\") on node \"crc\" DevicePath \"\"" Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.562518 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85abc773-d346-480c-9255-6eef5f2c4d29-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.981388 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-zrgdw" event={"ID":"46d660af-8326-4df1-a0e7-2f8804fea23d","Type":"ContainerDied","Data":"8575d4fd5f809af785d15b40199e333660caffc411ee9e1b01d975e207468004"} Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.981515 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zrgdw" Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.985006 4664 generic.go:334] "Generic (PLEG): container finished" podID="96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1" containerID="e891ccea0eb9f5a873ecf1f89302da2311263d8cc2defb9f9f2c45fa4370137d" exitCode=0 Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.985079 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb" event={"ID":"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1","Type":"ContainerDied","Data":"e891ccea0eb9f5a873ecf1f89302da2311263d8cc2defb9f9f2c45fa4370137d"} Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.989119 4664 generic.go:334] "Generic (PLEG): container finished" podID="671473b6-b445-4c88-8a2e-1c20a23c5b4f" containerID="6b308443a727265732f05613b559564a8a66dae747c333b44caeffd2e3a713b5" exitCode=0 Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.989210 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pvnb8" event={"ID":"671473b6-b445-4c88-8a2e-1c20a23c5b4f","Type":"ContainerDied","Data":"6b308443a727265732f05613b559564a8a66dae747c333b44caeffd2e3a713b5"} Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.990782 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" event={"ID":"85abc773-d346-480c-9255-6eef5f2c4d29","Type":"ContainerDied","Data":"a84bc645cac9c81cc00c6071c96710a1585cc7a3b9440b1cc92e808ae4e37333"} Oct 03 08:06:55 crc kubenswrapper[4664]: I1003 08:06:55.990857 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9dx47" Oct 03 08:06:56 crc kubenswrapper[4664]: I1003 08:06:56.113789 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zrgdw"] Oct 03 08:06:56 crc kubenswrapper[4664]: I1003 08:06:56.135266 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zrgdw"] Oct 03 08:06:56 crc kubenswrapper[4664]: I1003 08:06:56.150725 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9dx47"] Oct 03 08:06:56 crc kubenswrapper[4664]: I1003 08:06:56.151499 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9dx47"] Oct 03 08:06:57 crc kubenswrapper[4664]: I1003 08:06:57.009356 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb" event={"ID":"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1","Type":"ContainerStarted","Data":"290801e369a3162ccccc045da875b27f969d39c2ea636a29e333fb34a4069e3b"} Oct 03 08:06:57 crc kubenswrapper[4664]: I1003 08:06:57.010050 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb" Oct 03 08:06:57 crc kubenswrapper[4664]: I1003 08:06:57.013185 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pvnb8" event={"ID":"671473b6-b445-4c88-8a2e-1c20a23c5b4f","Type":"ContainerStarted","Data":"5826d31d1d03fb86aef1cb9b508a62670f5b3d5a0c13e270b08710952cb52e6a"} Oct 03 08:06:57 crc kubenswrapper[4664]: I1003 08:06:57.013380 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-pvnb8" Oct 03 08:06:57 crc kubenswrapper[4664]: I1003 08:06:57.033910 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb" 
podStartSLOduration=-9223371995.820894 podStartE2EDuration="41.0338819s" podCreationTimestamp="2025-10-03 08:06:16 +0000 UTC" firstStartedPulling="2025-10-03 08:06:17.240760613 +0000 UTC m=+1078.061951103" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:06:57.030316399 +0000 UTC m=+1117.851506909" watchObservedRunningTime="2025-10-03 08:06:57.0338819 +0000 UTC m=+1117.855072410" Oct 03 08:06:57 crc kubenswrapper[4664]: I1003 08:06:57.058736 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-pvnb8" podStartSLOduration=4.063925544 podStartE2EDuration="42.058701457s" podCreationTimestamp="2025-10-03 08:06:15 +0000 UTC" firstStartedPulling="2025-10-03 08:06:16.722516893 +0000 UTC m=+1077.543707383" lastFinishedPulling="2025-10-03 08:06:54.717292806 +0000 UTC m=+1115.538483296" observedRunningTime="2025-10-03 08:06:57.05144942 +0000 UTC m=+1117.872639920" watchObservedRunningTime="2025-10-03 08:06:57.058701457 +0000 UTC m=+1117.879891947" Oct 03 08:06:57 crc kubenswrapper[4664]: I1003 08:06:57.889062 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d660af-8326-4df1-a0e7-2f8804fea23d" path="/var/lib/kubelet/pods/46d660af-8326-4df1-a0e7-2f8804fea23d/volumes" Oct 03 08:06:57 crc kubenswrapper[4664]: I1003 08:06:57.889541 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85abc773-d346-480c-9255-6eef5f2c4d29" path="/var/lib/kubelet/pods/85abc773-d346-480c-9255-6eef5f2c4d29/volumes" Oct 03 08:07:01 crc kubenswrapper[4664]: I1003 08:07:01.372629 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-pvnb8" Oct 03 08:07:01 crc kubenswrapper[4664]: I1003 08:07:01.645244 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb" Oct 03 08:07:01 crc kubenswrapper[4664]: I1003 08:07:01.711116 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pvnb8"] Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.060041 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78","Type":"ContainerStarted","Data":"20735535ba8d1e6936e9b834ec07a02a347ec3772b4ad940395d5c4b6e007daf"} Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.063032 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fb5ld" event={"ID":"b65aa3e9-2d60-4cd9-b63a-93a07ab33e72","Type":"ContainerStarted","Data":"799dfa67d8250fb64286b67c341d806ecc9df7c8014374aca42b88bd6841b03e"} Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.064677 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-fb5ld" Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.066501 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qcwp9" event={"ID":"deb5f246-e857-4517-9f9c-290bc76ba8f6","Type":"ContainerStarted","Data":"4ed8a9ef743f083d778567cf6ccbc3f5964f0401f441d196917a3ae79c5eaf46"} Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.070376 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-pvnb8" podUID="671473b6-b445-4c88-8a2e-1c20a23c5b4f" containerName="dnsmasq-dns" containerID="cri-o://5826d31d1d03fb86aef1cb9b508a62670f5b3d5a0c13e270b08710952cb52e6a" gracePeriod=10 Oct 03 08:07:02 crc 
kubenswrapper[4664]: I1003 08:07:02.070722 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3","Type":"ContainerStarted","Data":"a734bf1e3dcc2fd2566e5302afaa616e55908e011ef3a4f82f07e5e38cdce9a0"} Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.088410 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-fb5ld" podStartSLOduration=29.456403003 podStartE2EDuration="36.088385852s" podCreationTimestamp="2025-10-03 08:06:26 +0000 UTC" firstStartedPulling="2025-10-03 08:06:54.637802703 +0000 UTC m=+1115.458993193" lastFinishedPulling="2025-10-03 08:07:01.269785552 +0000 UTC m=+1122.090976042" observedRunningTime="2025-10-03 08:07:02.083019439 +0000 UTC m=+1122.904209959" watchObservedRunningTime="2025-10-03 08:07:02.088385852 +0000 UTC m=+1122.909576342" Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.581131 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pvnb8" Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.751195 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqsl9\" (UniqueName: \"kubernetes.io/projected/671473b6-b445-4c88-8a2e-1c20a23c5b4f-kube-api-access-mqsl9\") pod \"671473b6-b445-4c88-8a2e-1c20a23c5b4f\" (UID: \"671473b6-b445-4c88-8a2e-1c20a23c5b4f\") " Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.751355 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/671473b6-b445-4c88-8a2e-1c20a23c5b4f-config\") pod \"671473b6-b445-4c88-8a2e-1c20a23c5b4f\" (UID: \"671473b6-b445-4c88-8a2e-1c20a23c5b4f\") " Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.751575 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/671473b6-b445-4c88-8a2e-1c20a23c5b4f-dns-svc\") pod \"671473b6-b445-4c88-8a2e-1c20a23c5b4f\" (UID: \"671473b6-b445-4c88-8a2e-1c20a23c5b4f\") " Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.758597 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671473b6-b445-4c88-8a2e-1c20a23c5b4f-kube-api-access-mqsl9" (OuterVolumeSpecName: "kube-api-access-mqsl9") pod "671473b6-b445-4c88-8a2e-1c20a23c5b4f" (UID: "671473b6-b445-4c88-8a2e-1c20a23c5b4f"). InnerVolumeSpecName "kube-api-access-mqsl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.802863 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671473b6-b445-4c88-8a2e-1c20a23c5b4f-config" (OuterVolumeSpecName: "config") pod "671473b6-b445-4c88-8a2e-1c20a23c5b4f" (UID: "671473b6-b445-4c88-8a2e-1c20a23c5b4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.830468 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671473b6-b445-4c88-8a2e-1c20a23c5b4f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "671473b6-b445-4c88-8a2e-1c20a23c5b4f" (UID: "671473b6-b445-4c88-8a2e-1c20a23c5b4f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.854196 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/671473b6-b445-4c88-8a2e-1c20a23c5b4f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.854252 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqsl9\" (UniqueName: \"kubernetes.io/projected/671473b6-b445-4c88-8a2e-1c20a23c5b4f-kube-api-access-mqsl9\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:02 crc kubenswrapper[4664]: I1003 08:07:02.854271 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/671473b6-b445-4c88-8a2e-1c20a23c5b4f-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.088362 4664 generic.go:334] "Generic (PLEG): container finished" podID="671473b6-b445-4c88-8a2e-1c20a23c5b4f" containerID="5826d31d1d03fb86aef1cb9b508a62670f5b3d5a0c13e270b08710952cb52e6a" exitCode=0 Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.088434 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pvnb8" Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.088451 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pvnb8" event={"ID":"671473b6-b445-4c88-8a2e-1c20a23c5b4f","Type":"ContainerDied","Data":"5826d31d1d03fb86aef1cb9b508a62670f5b3d5a0c13e270b08710952cb52e6a"} Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.088628 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pvnb8" event={"ID":"671473b6-b445-4c88-8a2e-1c20a23c5b4f","Type":"ContainerDied","Data":"ce1bd26e1916f81b444019a0a37e46b6bf0405f80acc73ce7b12a43044f036bb"} Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.088667 4664 scope.go:117] "RemoveContainer" containerID="5826d31d1d03fb86aef1cb9b508a62670f5b3d5a0c13e270b08710952cb52e6a" Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.092907 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"350764d9-9981-4b2d-b69d-42712338bdd1","Type":"ContainerStarted","Data":"19cef43be98901a0824cc94f26d83b370784ab8737e9bc184f10a317a9309279"} Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.093884 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.095835 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8ae1def-1d1a-4acd-af78-204219a99fe6","Type":"ContainerStarted","Data":"62f158f76865a1bd74283911d566ec6f8f9e54cea9ffdb4b21088f5420dd8544"} Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.102393 4664 generic.go:334] "Generic (PLEG): container finished" podID="deb5f246-e857-4517-9f9c-290bc76ba8f6" containerID="4ed8a9ef743f083d778567cf6ccbc3f5964f0401f441d196917a3ae79c5eaf46" exitCode=0 Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.104222 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qcwp9" event={"ID":"deb5f246-e857-4517-9f9c-290bc76ba8f6","Type":"ContainerDied","Data":"4ed8a9ef743f083d778567cf6ccbc3f5964f0401f441d196917a3ae79c5eaf46"} Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.122712 4664 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/memcached-0" podStartSLOduration=2.130858877 podStartE2EDuration="43.12268069s" podCreationTimestamp="2025-10-03 08:06:20 +0000 UTC" firstStartedPulling="2025-10-03 08:06:21.330762174 +0000 UTC m=+1082.151952664" lastFinishedPulling="2025-10-03 08:07:02.322583987 +0000 UTC m=+1123.143774477" observedRunningTime="2025-10-03 08:07:03.11390305 +0000 UTC m=+1123.935093550" watchObservedRunningTime="2025-10-03 08:07:03.12268069 +0000 UTC m=+1123.943871180" Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.127852 4664 scope.go:117] "RemoveContainer" containerID="6b308443a727265732f05613b559564a8a66dae747c333b44caeffd2e3a713b5" Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.157145 4664 scope.go:117] "RemoveContainer" containerID="5826d31d1d03fb86aef1cb9b508a62670f5b3d5a0c13e270b08710952cb52e6a" Oct 03 08:07:03 crc kubenswrapper[4664]: E1003 08:07:03.158814 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5826d31d1d03fb86aef1cb9b508a62670f5b3d5a0c13e270b08710952cb52e6a\": container with ID starting with 5826d31d1d03fb86aef1cb9b508a62670f5b3d5a0c13e270b08710952cb52e6a not found: ID does not exist" containerID="5826d31d1d03fb86aef1cb9b508a62670f5b3d5a0c13e270b08710952cb52e6a" Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.158865 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5826d31d1d03fb86aef1cb9b508a62670f5b3d5a0c13e270b08710952cb52e6a"} err="failed to get container status \"5826d31d1d03fb86aef1cb9b508a62670f5b3d5a0c13e270b08710952cb52e6a\": rpc error: code = NotFound desc = could not find container \"5826d31d1d03fb86aef1cb9b508a62670f5b3d5a0c13e270b08710952cb52e6a\": container with ID starting with 5826d31d1d03fb86aef1cb9b508a62670f5b3d5a0c13e270b08710952cb52e6a not found: ID does not exist" Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.158890 4664 scope.go:117] "RemoveContainer" containerID="6b308443a727265732f05613b559564a8a66dae747c333b44caeffd2e3a713b5" Oct 03 08:07:03 crc kubenswrapper[4664]: E1003 08:07:03.165883 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b308443a727265732f05613b559564a8a66dae747c333b44caeffd2e3a713b5\": container with ID starting with 6b308443a727265732f05613b559564a8a66dae747c333b44caeffd2e3a713b5 not found: ID does not exist" containerID="6b308443a727265732f05613b559564a8a66dae747c333b44caeffd2e3a713b5" Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.165945 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b308443a727265732f05613b559564a8a66dae747c333b44caeffd2e3a713b5"} err="failed to get container status \"6b308443a727265732f05613b559564a8a66dae747c333b44caeffd2e3a713b5\": rpc error: code = NotFound desc = could not find container \"6b308443a727265732f05613b559564a8a66dae747c333b44caeffd2e3a713b5\": container with ID starting with 6b308443a727265732f05613b559564a8a66dae747c333b44caeffd2e3a713b5 not found: ID does not exist" Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.209062 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pvnb8"] Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.215802 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pvnb8"] Oct 03 08:07:03 crc kubenswrapper[4664]: I1003 08:07:03.899049 4664 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="671473b6-b445-4c88-8a2e-1c20a23c5b4f" path="/var/lib/kubelet/pods/671473b6-b445-4c88-8a2e-1c20a23c5b4f/volumes" Oct 03 08:07:04 crc kubenswrapper[4664]: I1003 08:07:04.122426 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qcwp9" event={"ID":"deb5f246-e857-4517-9f9c-290bc76ba8f6","Type":"ContainerStarted","Data":"4805a06f1869da924272b1768f9cfd51bcccf95870a1640ed16d78b06b3f5e29"} Oct 03 08:07:04 crc kubenswrapper[4664]: I1003 08:07:04.122479 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qcwp9" event={"ID":"deb5f246-e857-4517-9f9c-290bc76ba8f6","Type":"ContainerStarted","Data":"b6a5351dbe7853a051a4d4591968e563afa2ce0831834c6c72610a264ba6127e"} Oct 03 08:07:04 crc kubenswrapper[4664]: I1003 08:07:04.122994 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:07:04 crc kubenswrapper[4664]: I1003 08:07:04.123119 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qcwp9" Oct 03 08:07:04 crc kubenswrapper[4664]: I1003 08:07:04.151392 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-qcwp9" podStartSLOduration=31.915400251 podStartE2EDuration="38.151364188s" podCreationTimestamp="2025-10-03 08:06:26 +0000 UTC" firstStartedPulling="2025-10-03 08:06:54.886824761 +0000 UTC m=+1115.708015251" lastFinishedPulling="2025-10-03 08:07:01.122788698 +0000 UTC m=+1121.943979188" observedRunningTime="2025-10-03 08:07:04.14337385 +0000 UTC m=+1124.964564340" watchObservedRunningTime="2025-10-03 08:07:04.151364188 +0000 UTC m=+1124.972554678" Oct 03 08:07:07 crc kubenswrapper[4664]: I1003 08:07:07.152918 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0710a5a3-3e65-42b8-bd1d-d40deb6a325d","Type":"ContainerStarted","Data":"aed462f888e5faa64216d1bbd507634f2745a86382b59548304a11f3a42e7dcb"} Oct 03 08:07:07 crc kubenswrapper[4664]: I1003 08:07:07.159416 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3","Type":"ContainerStarted","Data":"02a380c8a6975a8be14f13cb83f8e0ddadf9a65c8f990dd4c58d80ba5e84f038"} Oct 03 08:07:07 crc kubenswrapper[4664]: I1003 08:07:07.207329 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=31.053372002 podStartE2EDuration="42.207293375s" podCreationTimestamp="2025-10-03 08:06:25 +0000 UTC" firstStartedPulling="2025-10-03 08:06:54.692257823 +0000 UTC m=+1115.513448313" lastFinishedPulling="2025-10-03 08:07:05.846179196 +0000 UTC m=+1126.667369686" observedRunningTime="2025-10-03 08:07:07.203378184 +0000 UTC m=+1128.024568684" watchObservedRunningTime="2025-10-03 08:07:07.207293375 +0000 UTC m=+1128.028483865" Oct 03 08:07:08 crc kubenswrapper[4664]: I1003 08:07:08.171286 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78","Type":"ContainerStarted","Data":"a6a5c1a1068324fbb9b7ee3e0b9e750286db965a52ef5dabe0f531a58dbbd91e"} Oct 03 08:07:08 crc kubenswrapper[4664]: I1003 08:07:08.174355 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"a5b770eb-2222-43c8-bb15-6e2d18e95fbf","Type":"ContainerStarted","Data":"6a5c965261238bf1912fa3f341678a6cce7d7d8ce8c1a088d13ebfdba6fef2ba"} Oct 03 08:07:08 crc kubenswrapper[4664]: I1003 08:07:08.241214 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=28.069711189 podStartE2EDuration="39.241181352s" podCreationTimestamp="2025-10-03 08:06:29 +0000 UTC" firstStartedPulling="2025-10-03 08:06:54.692129399 +0000 UTC m=+1115.513319889" lastFinishedPulling="2025-10-03 08:07:05.863599552 +0000 UTC m=+1126.684790052" observedRunningTime="2025-10-03 08:07:08.206007241 +0000 UTC m=+1129.027197751" watchObservedRunningTime="2025-10-03 08:07:08.241181352 +0000 UTC m=+1129.062371852" Oct 03 08:07:08 crc kubenswrapper[4664]: I1003 08:07:08.831592 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 03 08:07:08 crc kubenswrapper[4664]: I1003 08:07:08.874242 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.183162 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c1dcd075-92b3-4f17-888c-2e5580a45789","Type":"ContainerStarted","Data":"f61ffab7b1d18ae4b5b27fe6766cb5a6acaedbae8434e907bbc8459f00ffbdab"} Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.183754 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.208986 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.8760175719999999 podStartE2EDuration="47.208957658s" podCreationTimestamp="2025-10-03 08:06:22 +0000 UTC" firstStartedPulling="2025-10-03 08:06:23.359745947 +0000 UTC m=+1084.180936437" lastFinishedPulling="2025-10-03 08:07:08.692686033 +0000 UTC m=+1129.513876523" observedRunningTime="2025-10-03 08:07:09.19957424 +0000 UTC m=+1130.020764760" watchObservedRunningTime="2025-10-03 08:07:09.208957658 +0000 UTC m=+1130.030148148" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.226704 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.517933 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-w782k"] Oct 03 08:07:09 crc kubenswrapper[4664]: E1003 08:07:09.518819 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671473b6-b445-4c88-8a2e-1c20a23c5b4f" containerName="init" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.518840 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="671473b6-b445-4c88-8a2e-1c20a23c5b4f" containerName="init" Oct 03 08:07:09 crc kubenswrapper[4664]: E1003 08:07:09.518862 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671473b6-b445-4c88-8a2e-1c20a23c5b4f" containerName="dnsmasq-dns" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.518871 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="671473b6-b445-4c88-8a2e-1c20a23c5b4f" containerName="dnsmasq-dns" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.519043 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="671473b6-b445-4c88-8a2e-1c20a23c5b4f" containerName="dnsmasq-dns" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 
08:07:09.519890 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.519983 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.526260 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.559318 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-w782k"] Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.581558 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.634294 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-2k7c8"] Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.635152 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-config\") pod \"dnsmasq-dns-5bf47b49b7-w782k\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.635263 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hwg7\" (UniqueName: \"kubernetes.io/projected/7b9b1417-849a-446f-bdab-62f46de22cca-kube-api-access-5hwg7\") pod \"dnsmasq-dns-5bf47b49b7-w782k\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.635350 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-w782k\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.635383 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-w782k\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.636158 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.639153 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.646457 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-2k7c8"] Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.737239 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-w782k\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.737285 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-ovs-rundir\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.737324 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-w782k\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.737359 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.737380 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fn2n\" (UniqueName: \"kubernetes.io/projected/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-kube-api-access-5fn2n\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.737730 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-config\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.737797 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-config\") pod \"dnsmasq-dns-5bf47b49b7-w782k\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.738176 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hwg7\" (UniqueName: \"kubernetes.io/projected/7b9b1417-849a-446f-bdab-62f46de22cca-kube-api-access-5hwg7\") pod \"dnsmasq-dns-5bf47b49b7-w782k\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " 
pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.738337 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-ovn-rundir\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.738458 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-combined-ca-bundle\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.738660 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-w782k\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.738704 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-w782k\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.738780 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-config\") pod \"dnsmasq-dns-5bf47b49b7-w782k\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.778221 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hwg7\" (UniqueName: \"kubernetes.io/projected/7b9b1417-849a-446f-bdab-62f46de22cca-kube-api-access-5hwg7\") pod \"dnsmasq-dns-5bf47b49b7-w782k\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.840681 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-config\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.840823 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-ovn-rundir\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.840898 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-combined-ca-bundle\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: 
I1003 08:07:09.841387 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-ovn-rundir\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.841544 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-ovs-rundir\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.841665 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-ovs-rundir\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.841763 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-config\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.841818 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.842491 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fn2n\" (UniqueName: \"kubernetes.io/projected/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-kube-api-access-5fn2n\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.846203 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.846525 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-combined-ca-bundle\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.846598 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.874108 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fn2n\" (UniqueName: \"kubernetes.io/projected/9f1aa7a2-0b64-4d75-9e05-9a53c987be28-kube-api-access-5fn2n\") pod \"ovn-controller-metrics-2k7c8\" (UID: \"9f1aa7a2-0b64-4d75-9e05-9a53c987be28\") " pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:09 crc kubenswrapper[4664]: I1003 08:07:09.955756 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-2k7c8" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.057770 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-w782k"] Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.095938 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-6rhzw"] Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.108022 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.130969 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.141382 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-6rhzw"] Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.156133 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-config\") pod \"dnsmasq-dns-8554648995-6rhzw\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.156206 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q897\" (UniqueName: \"kubernetes.io/projected/e8f29518-3766-425f-b4fd-737fde3bf509-kube-api-access-8q897\") pod \"dnsmasq-dns-8554648995-6rhzw\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.156368 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-dns-svc\") pod \"dnsmasq-dns-8554648995-6rhzw\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.156419 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-6rhzw\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.156536 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-6rhzw\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 
08:07:10.210148 4664 generic.go:334] "Generic (PLEG): container finished" podID="0710a5a3-3e65-42b8-bd1d-d40deb6a325d" containerID="aed462f888e5faa64216d1bbd507634f2745a86382b59548304a11f3a42e7dcb" exitCode=0 Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.210698 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0710a5a3-3e65-42b8-bd1d-d40deb6a325d","Type":"ContainerDied","Data":"aed462f888e5faa64216d1bbd507634f2745a86382b59548304a11f3a42e7dcb"} Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.211480 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.260727 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-config\") pod \"dnsmasq-dns-8554648995-6rhzw\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.260791 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q897\" (UniqueName: \"kubernetes.io/projected/e8f29518-3766-425f-b4fd-737fde3bf509-kube-api-access-8q897\") pod \"dnsmasq-dns-8554648995-6rhzw\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.260900 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-dns-svc\") pod \"dnsmasq-dns-8554648995-6rhzw\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.260955 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-6rhzw\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.261024 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-6rhzw\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.262912 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-dns-svc\") pod \"dnsmasq-dns-8554648995-6rhzw\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.263426 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-config\") pod \"dnsmasq-dns-8554648995-6rhzw\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.264418 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-6rhzw\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.265239 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-6rhzw\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.283006 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q897\" (UniqueName: \"kubernetes.io/projected/e8f29518-3766-425f-b4fd-737fde3bf509-kube-api-access-8q897\") pod \"dnsmasq-dns-8554648995-6rhzw\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.283340 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.450092 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.502935 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-w782k"] Oct 03 08:07:10 crc kubenswrapper[4664]: W1003 08:07:10.513872 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b9b1417_849a_446f_bdab_62f46de22cca.slice/crio-cdbf1509e1e6635c9d64a4def54dc88a234788ae31f40173ace22694c9079f40 WatchSource:0}: Error finding container cdbf1509e1e6635c9d64a4def54dc88a234788ae31f40173ace22694c9079f40: Status 404 returned error can't find the container with id cdbf1509e1e6635c9d64a4def54dc88a234788ae31f40173ace22694c9079f40 Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.600784 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-2k7c8"] Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.830551 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.836989 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.837108 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.840005 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.840258 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.840401 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.840400 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-cbp48" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.877742 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.894425 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-6rhzw"] Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.977556 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/55811e3b-b345-4cc9-9ade-c8f977a0706c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.977677 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55811e3b-b345-4cc9-9ade-c8f977a0706c-config\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.977706 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55811e3b-b345-4cc9-9ade-c8f977a0706c-scripts\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.977761 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55811e3b-b345-4cc9-9ade-c8f977a0706c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.977818 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55811e3b-b345-4cc9-9ade-c8f977a0706c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.977843 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t69c8\" (UniqueName: \"kubernetes.io/projected/55811e3b-b345-4cc9-9ade-c8f977a0706c-kube-api-access-t69c8\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:10 crc kubenswrapper[4664]: I1003 08:07:10.979636 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/55811e3b-b345-4cc9-9ade-c8f977a0706c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.086996 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55811e3b-b345-4cc9-9ade-c8f977a0706c-config\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.087054 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55811e3b-b345-4cc9-9ade-c8f977a0706c-scripts\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.087082 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55811e3b-b345-4cc9-9ade-c8f977a0706c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.087117 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55811e3b-b345-4cc9-9ade-c8f977a0706c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.087141 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t69c8\" (UniqueName: \"kubernetes.io/projected/55811e3b-b345-4cc9-9ade-c8f977a0706c-kube-api-access-t69c8\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.087258 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55811e3b-b345-4cc9-9ade-c8f977a0706c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.087311 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/55811e3b-b345-4cc9-9ade-c8f977a0706c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.090813 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55811e3b-b345-4cc9-9ade-c8f977a0706c-config\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.091587 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55811e3b-b345-4cc9-9ade-c8f977a0706c-scripts\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.092064 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/55811e3b-b345-4cc9-9ade-c8f977a0706c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.255477 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/55811e3b-b345-4cc9-9ade-c8f977a0706c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.267626 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55811e3b-b345-4cc9-9ade-c8f977a0706c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.267727 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55811e3b-b345-4cc9-9ade-c8f977a0706c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.268080 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t69c8\" (UniqueName: \"kubernetes.io/projected/55811e3b-b345-4cc9-9ade-c8f977a0706c-kube-api-access-t69c8\") pod \"ovn-northd-0\" (UID: \"55811e3b-b345-4cc9-9ade-c8f977a0706c\") " pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.278840 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0710a5a3-3e65-42b8-bd1d-d40deb6a325d","Type":"ContainerStarted","Data":"339efdf68f0a122f44ffea8090ea1f68cef7256f1c9cf2970225de45bd3a838d"} Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.283695 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-2k7c8" event={"ID":"9f1aa7a2-0b64-4d75-9e05-9a53c987be28","Type":"ContainerStarted","Data":"d10cefcb9be0a1f6333276d35972bd21af45efb949f89402a037e21436e41f2e"} Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.285243 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" event={"ID":"7b9b1417-849a-446f-bdab-62f46de22cca","Type":"ContainerStarted","Data":"9213143e118b371a30146c925144e10aaf1ed5014bc1fd7c60302f0fb24dc1ce"} Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.285270 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" event={"ID":"7b9b1417-849a-446f-bdab-62f46de22cca","Type":"ContainerStarted","Data":"cdbf1509e1e6635c9d64a4def54dc88a234788ae31f40173ace22694c9079f40"} Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.285391 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" podUID="7b9b1417-849a-446f-bdab-62f46de22cca" containerName="init" containerID="cri-o://9213143e118b371a30146c925144e10aaf1ed5014bc1fd7c60302f0fb24dc1ce" gracePeriod=10 Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.291830 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-6rhzw" event={"ID":"e8f29518-3766-425f-b4fd-737fde3bf509","Type":"ContainerStarted","Data":"6ece55bd01fc9bd5b77bb3fcf80b021e58f51626908937c76a9191d98146051d"} Oct 03 08:07:11 
crc kubenswrapper[4664]: I1003 08:07:11.368850 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.371393 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.333491408 podStartE2EDuration="52.371369074s" podCreationTimestamp="2025-10-03 08:06:19 +0000 UTC" firstStartedPulling="2025-10-03 08:06:21.81075237 +0000 UTC m=+1082.631942860" lastFinishedPulling="2025-10-03 08:07:05.848630036 +0000 UTC m=+1126.669820526" observedRunningTime="2025-10-03 08:07:11.331626193 +0000 UTC m=+1132.152816693" watchObservedRunningTime="2025-10-03 08:07:11.371369074 +0000 UTC m=+1132.192559564" Oct 03 08:07:11 crc kubenswrapper[4664]: I1003 08:07:11.693018 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 08:07:11 crc kubenswrapper[4664]: W1003 08:07:11.702228 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55811e3b_b345_4cc9_9ade_c8f977a0706c.slice/crio-46538053c16c32dc6489375bfec81ffc151c50b3e8293a0a0b40138fcaeacd7a WatchSource:0}: Error finding container 46538053c16c32dc6489375bfec81ffc151c50b3e8293a0a0b40138fcaeacd7a: Status 404 returned error can't find the container with id 46538053c16c32dc6489375bfec81ffc151c50b3e8293a0a0b40138fcaeacd7a Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.302860 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8bb7b62d-f030-45a7-b9f8-87852ea275de","Type":"ContainerStarted","Data":"dea969824e6c260d6adfd6cb873a9ed48c1243cced9fbb9c84161d22c5a1daa9"} Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.306257 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"55811e3b-b345-4cc9-9ade-c8f977a0706c","Type":"ContainerStarted","Data":"46538053c16c32dc6489375bfec81ffc151c50b3e8293a0a0b40138fcaeacd7a"} Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.308316 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-2k7c8" event={"ID":"9f1aa7a2-0b64-4d75-9e05-9a53c987be28","Type":"ContainerStarted","Data":"e11e5d15405b19a726b3625f5269e2e44fa82c315b8093006b8ea7eb312ecda6"} Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.311658 4664 generic.go:334] "Generic (PLEG): container finished" podID="7b9b1417-849a-446f-bdab-62f46de22cca" containerID="9213143e118b371a30146c925144e10aaf1ed5014bc1fd7c60302f0fb24dc1ce" exitCode=0 Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.311773 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" event={"ID":"7b9b1417-849a-446f-bdab-62f46de22cca","Type":"ContainerDied","Data":"9213143e118b371a30146c925144e10aaf1ed5014bc1fd7c60302f0fb24dc1ce"} Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.693468 4664 util.go:48] "No ready sandbox for pod can be found. 
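[Editor's note] The pod_startup_latency_tracker entry above for openstack-cell1-galera-0 is internally consistent: the SLO duration appears to be the end-to-end duration minus the time spent pulling images. A quick check with the timestamps copied verbatim from that entry (our arithmetic, not kubelet code):

```go
package main

import (
	"fmt"
	"time"
)

// ts parses timestamps in the format the tracker logs them.
func ts(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := ts("2025-10-03 08:06:19 +0000 UTC")
	firstPull := ts("2025-10-03 08:06:21.81075237 +0000 UTC")
	lastPull := ts("2025-10-03 08:07:05.848630036 +0000 UTC")
	observed := ts("2025-10-03 08:07:11.371369074 +0000 UTC")

	e2e := observed.Sub(created)  // 52.371369074s = podStartE2EDuration
	pull := lastPull.Sub(firstPull) // 44.037877666s spent pulling the image
	fmt.Println(e2e, pull, e2e-pull) // e2e-pull = 8.333491408s = podStartSLOduration
}
```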
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.715811 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-2k7c8" podStartSLOduration=3.715789709 podStartE2EDuration="3.715789709s" podCreationTimestamp="2025-10-03 08:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:07:12.400244828 +0000 UTC m=+1133.221435338" watchObservedRunningTime="2025-10-03 08:07:12.715789709 +0000 UTC m=+1133.536980199" Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.731252 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-ovsdbserver-nb\") pod \"7b9b1417-849a-446f-bdab-62f46de22cca\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.731344 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-dns-svc\") pod \"7b9b1417-849a-446f-bdab-62f46de22cca\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.731457 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-config\") pod \"7b9b1417-849a-446f-bdab-62f46de22cca\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.731543 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hwg7\" (UniqueName: \"kubernetes.io/projected/7b9b1417-849a-446f-bdab-62f46de22cca-kube-api-access-5hwg7\") pod \"7b9b1417-849a-446f-bdab-62f46de22cca\" (UID: \"7b9b1417-849a-446f-bdab-62f46de22cca\") " Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.740743 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b9b1417-849a-446f-bdab-62f46de22cca-kube-api-access-5hwg7" (OuterVolumeSpecName: "kube-api-access-5hwg7") pod "7b9b1417-849a-446f-bdab-62f46de22cca" (UID: "7b9b1417-849a-446f-bdab-62f46de22cca"). InnerVolumeSpecName "kube-api-access-5hwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.767471 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-config" (OuterVolumeSpecName: "config") pod "7b9b1417-849a-446f-bdab-62f46de22cca" (UID: "7b9b1417-849a-446f-bdab-62f46de22cca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.777377 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7b9b1417-849a-446f-bdab-62f46de22cca" (UID: "7b9b1417-849a-446f-bdab-62f46de22cca"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.779439 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.805133 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7b9b1417-849a-446f-bdab-62f46de22cca" (UID: "7b9b1417-849a-446f-bdab-62f46de22cca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.835105 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.835134 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hwg7\" (UniqueName: \"kubernetes.io/projected/7b9b1417-849a-446f-bdab-62f46de22cca-kube-api-access-5hwg7\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.835145 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.835156 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b9b1417-849a-446f-bdab-62f46de22cca-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.907038 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-6rhzw"] Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.947999 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-7nzs5"] Oct 03 08:07:12 crc kubenswrapper[4664]: E1003 08:07:12.948479 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9b1417-849a-446f-bdab-62f46de22cca" containerName="init" Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.948500 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9b1417-849a-446f-bdab-62f46de22cca" containerName="init" Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.948757 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9b1417-849a-446f-bdab-62f46de22cca" containerName="init" Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.949875 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:12 crc kubenswrapper[4664]: I1003 08:07:12.971023 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-7nzs5"] Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.040890 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-7nzs5\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.040985 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-7nzs5\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.041099 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-7nzs5\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.041335 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49bw\" (UniqueName: \"kubernetes.io/projected/455d48b8-17a8-4c2f-9923-a30626506388-kube-api-access-b49bw\") pod \"dnsmasq-dns-b8fbc5445-7nzs5\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.041508 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-config\") pod \"dnsmasq-dns-b8fbc5445-7nzs5\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.142922 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49bw\" (UniqueName: \"kubernetes.io/projected/455d48b8-17a8-4c2f-9923-a30626506388-kube-api-access-b49bw\") pod \"dnsmasq-dns-b8fbc5445-7nzs5\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.143008 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-config\") pod \"dnsmasq-dns-b8fbc5445-7nzs5\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.143077 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-7nzs5\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.143112 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-7nzs5\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.143165 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-7nzs5\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.144300 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-7nzs5\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.145105 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-7nzs5\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.145432 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-config\") pod \"dnsmasq-dns-b8fbc5445-7nzs5\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.147979 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-7nzs5\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.170453 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49bw\" (UniqueName: \"kubernetes.io/projected/455d48b8-17a8-4c2f-9923-a30626506388-kube-api-access-b49bw\") pod \"dnsmasq-dns-b8fbc5445-7nzs5\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.272495 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.340053 4664 generic.go:334] "Generic (PLEG): container finished" podID="a5b770eb-2222-43c8-bb15-6e2d18e95fbf" containerID="6a5c965261238bf1912fa3f341678a6cce7d7d8ce8c1a088d13ebfdba6fef2ba" exitCode=0 Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.340208 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a5b770eb-2222-43c8-bb15-6e2d18e95fbf","Type":"ContainerDied","Data":"6a5c965261238bf1912fa3f341678a6cce7d7d8ce8c1a088d13ebfdba6fef2ba"} Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.359695 4664 generic.go:334] "Generic (PLEG): container finished" podID="e8f29518-3766-425f-b4fd-737fde3bf509" containerID="688c505e3e1c67d093425773171e52c9beadae1bbd4a34d0fc8778843255da0a" exitCode=0 Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.359801 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-6rhzw" event={"ID":"e8f29518-3766-425f-b4fd-737fde3bf509","Type":"ContainerDied","Data":"688c505e3e1c67d093425773171e52c9beadae1bbd4a34d0fc8778843255da0a"} Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.365746 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.365798 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-w782k" event={"ID":"7b9b1417-849a-446f-bdab-62f46de22cca","Type":"ContainerDied","Data":"cdbf1509e1e6635c9d64a4def54dc88a234788ae31f40173ace22694c9079f40"} Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.365892 4664 scope.go:117] "RemoveContainer" containerID="9213143e118b371a30146c925144e10aaf1ed5014bc1fd7c60302f0fb24dc1ce" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.504204 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-w782k"] Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.518704 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-w782k"] Oct 03 08:07:13 crc kubenswrapper[4664]: E1003 08:07:13.632203 4664 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 03 08:07:13 crc kubenswrapper[4664]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/e8f29518-3766-425f-b4fd-737fde3bf509/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 03 08:07:13 crc kubenswrapper[4664]: > podSandboxID="6ece55bd01fc9bd5b77bb3fcf80b021e58f51626908937c76a9191d98146051d" Oct 03 08:07:13 crc kubenswrapper[4664]: E1003 08:07:13.632800 4664 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 03 08:07:13 crc kubenswrapper[4664]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h99h64ch5dbh6dh555h587h64bh5cfh647h5fdh57ch679h9h597h5f5hbch59bh54fh575h566h667h586h5f5h65ch5bch57h68h65ch58bh694h5cfq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8q897,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8554648995-6rhzw_openstack(e8f29518-3766-425f-b4fd-737fde3bf509): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/e8f29518-3766-425f-b4fd-737fde3bf509/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 03 08:07:13 crc kubenswrapper[4664]: > logger="UnhandledError" Oct 03 08:07:13 crc kubenswrapper[4664]: E1003 08:07:13.634118 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/e8f29518-3766-425f-b4fd-737fde3bf509/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-8554648995-6rhzw" podUID="e8f29518-3766-425f-b4fd-737fde3bf509" Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.762876 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
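[Editor's note] The CreateContainerError above looks like a deletion race rather than a configuration problem: every hosts file in that spec is mounted with SubPath, the kubelet materializes each subPath as a bind source under /var/lib/kubelet/pods/<pod-uid>/volume-subpaths/<volume>/<container>/<n>, and the SyncLoop DELETE for dnsmasq-dns-8554648995-6rhzw logged moments earlier tears that directory down while CRI-O is still trying to bind it, hence "No such file or directory". For shape, one of the subPath mounts involved, with fields copied from the spec dump (k8s.io/api import assumed):

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// One of the subPath mounts from the dnsmasq-dns container spec above.
	// Because SubPath is set, the runtime binds the per-container path
	// /var/lib/kubelet/pods/<uid>/volume-subpaths/dns-svc/dnsmasq-dns/1,
	// which is exactly the path the error says has vanished.
	m := corev1.VolumeMount{
		Name:      "dns-svc",
		ReadOnly:  true,
		MountPath: "/etc/dnsmasq.d/hosts/dns-svc",
		SubPath:   "dns-svc",
	}
	fmt.Printf("%+v\n", m)
}
```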
pods=["openstack/dnsmasq-dns-b8fbc5445-7nzs5"] Oct 03 08:07:13 crc kubenswrapper[4664]: W1003 08:07:13.768776 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod455d48b8_17a8_4c2f_9923_a30626506388.slice/crio-54ea4fac0a1e7542db8d75d60858a8b0ccf58ebcc0d9756e676346e56dd44951 WatchSource:0}: Error finding container 54ea4fac0a1e7542db8d75d60858a8b0ccf58ebcc0d9756e676346e56dd44951: Status 404 returned error can't find the container with id 54ea4fac0a1e7542db8d75d60858a8b0ccf58ebcc0d9756e676346e56dd44951 Oct 03 08:07:13 crc kubenswrapper[4664]: I1003 08:07:13.894566 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b9b1417-849a-446f-bdab-62f46de22cca" path="/var/lib/kubelet/pods/7b9b1417-849a-446f-bdab-62f46de22cca/volumes" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.026042 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.076979 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.078434 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.083205 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.083403 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.083576 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.083096 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-pv449" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.274688 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.274756 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-lock\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.274779 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p4nb\" (UniqueName: \"kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-kube-api-access-6p4nb\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.275244 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-cache\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.275317 4664 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.376490 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-cache\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.376543 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.376483 4664 generic.go:334] "Generic (PLEG): container finished" podID="455d48b8-17a8-4c2f-9923-a30626506388" containerID="e9acd6a3e935cf29a65fe995181e25088e73a6149a4a689a65ed6d8709c8d13f" exitCode=0 Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.376512 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" event={"ID":"455d48b8-17a8-4c2f-9923-a30626506388","Type":"ContainerDied","Data":"e9acd6a3e935cf29a65fe995181e25088e73a6149a4a689a65ed6d8709c8d13f"} Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.376735 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" event={"ID":"455d48b8-17a8-4c2f-9923-a30626506388","Type":"ContainerStarted","Data":"54ea4fac0a1e7542db8d75d60858a8b0ccf58ebcc0d9756e676346e56dd44951"} Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.376883 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.377528 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-cache\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.379778 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a5b770eb-2222-43c8-bb15-6e2d18e95fbf","Type":"ContainerStarted","Data":"0529836dc8eaec0491801135009fb96019ed93045256cd028b4dbf78bff9ec84"} Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.381205 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"55811e3b-b345-4cc9-9ade-c8f977a0706c","Type":"ContainerStarted","Data":"2437010580e768675a0dd7f492521e97e1778638742b7c1b1c19372f3a65bee7"} Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.381247 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"55811e3b-b345-4cc9-9ade-c8f977a0706c","Type":"ContainerStarted","Data":"5cc001e335d18e5c31d18e1919d9b5c71e00f3957760b825ce9bc4d58ad8192a"} Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.388031 4664 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.388136 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-lock\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.388187 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p4nb\" (UniqueName: \"kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-kube-api-access-6p4nb\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: E1003 08:07:14.389041 4664 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 08:07:14 crc kubenswrapper[4664]: E1003 08:07:14.389065 4664 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 08:07:14 crc kubenswrapper[4664]: E1003 08:07:14.389116 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift podName:2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e nodeName:}" failed. No retries permitted until 2025-10-03 08:07:14.889094344 +0000 UTC m=+1135.710284834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift") pod "swift-storage-0" (UID: "2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e") : configmap "swift-ring-files" not found Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.389503 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-lock\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.423879 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p4nb\" (UniqueName: \"kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-kube-api-access-6p4nb\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.433736 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371979.421066 podStartE2EDuration="57.433708614s" podCreationTimestamp="2025-10-03 08:06:17 +0000 UTC" firstStartedPulling="2025-10-03 08:06:20.044317812 +0000 UTC m=+1080.865508302" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:07:14.430903224 +0000 UTC m=+1135.252093734" watchObservedRunningTime="2025-10-03 08:07:14.433708614 +0000 UTC m=+1135.254899104" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.441921 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.459166 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.9005811379999997 podStartE2EDuration="4.459137558s" podCreationTimestamp="2025-10-03 08:07:10 +0000 UTC" firstStartedPulling="2025-10-03 08:07:11.709086576 +0000 UTC m=+1132.530277066" lastFinishedPulling="2025-10-03 08:07:13.267642996 +0000 UTC m=+1134.088833486" observedRunningTime="2025-10-03 08:07:14.45253313 +0000 UTC m=+1135.273723620" watchObservedRunningTime="2025-10-03 08:07:14.459137558 +0000 UTC m=+1135.280328058" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.648461 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.794778 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-dns-svc\") pod \"e8f29518-3766-425f-b4fd-737fde3bf509\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.794901 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-ovsdbserver-nb\") pod \"e8f29518-3766-425f-b4fd-737fde3bf509\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.794974 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-config\") pod \"e8f29518-3766-425f-b4fd-737fde3bf509\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.795042 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q897\" (UniqueName: \"kubernetes.io/projected/e8f29518-3766-425f-b4fd-737fde3bf509-kube-api-access-8q897\") pod \"e8f29518-3766-425f-b4fd-737fde3bf509\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.795119 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-ovsdbserver-sb\") pod \"e8f29518-3766-425f-b4fd-737fde3bf509\" (UID: \"e8f29518-3766-425f-b4fd-737fde3bf509\") " Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.800748 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f29518-3766-425f-b4fd-737fde3bf509-kube-api-access-8q897" (OuterVolumeSpecName: "kube-api-access-8q897") pod "e8f29518-3766-425f-b4fd-737fde3bf509" (UID: "e8f29518-3766-425f-b4fd-737fde3bf509"). InnerVolumeSpecName "kube-api-access-8q897". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.842686 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-config" (OuterVolumeSpecName: "config") pod "e8f29518-3766-425f-b4fd-737fde3bf509" (UID: "e8f29518-3766-425f-b4fd-737fde3bf509"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.851927 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e8f29518-3766-425f-b4fd-737fde3bf509" (UID: "e8f29518-3766-425f-b4fd-737fde3bf509"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.859248 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e8f29518-3766-425f-b4fd-737fde3bf509" (UID: "e8f29518-3766-425f-b4fd-737fde3bf509"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.860382 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e8f29518-3766-425f-b4fd-737fde3bf509" (UID: "e8f29518-3766-425f-b4fd-737fde3bf509"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.896569 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.896758 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.896771 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:14 crc kubenswrapper[4664]: E1003 08:07:14.896901 4664 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 08:07:14 crc kubenswrapper[4664]: E1003 08:07:14.896916 4664 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.896935 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q897\" (UniqueName: \"kubernetes.io/projected/e8f29518-3766-425f-b4fd-737fde3bf509-kube-api-access-8q897\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:14 crc kubenswrapper[4664]: E1003 08:07:14.896970 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift podName:2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e nodeName:}" failed. No retries permitted until 2025-10-03 08:07:15.896953919 +0000 UTC m=+1136.718144409 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift") pod "swift-storage-0" (UID: "2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e") : configmap "swift-ring-files" not found Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.896990 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:14 crc kubenswrapper[4664]: I1003 08:07:14.897003 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8f29518-3766-425f-b4fd-737fde3bf509-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:15 crc kubenswrapper[4664]: I1003 08:07:15.392704 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-6rhzw" event={"ID":"e8f29518-3766-425f-b4fd-737fde3bf509","Type":"ContainerDied","Data":"6ece55bd01fc9bd5b77bb3fcf80b021e58f51626908937c76a9191d98146051d"} Oct 03 08:07:15 crc kubenswrapper[4664]: I1003 08:07:15.394521 4664 scope.go:117] "RemoveContainer" containerID="688c505e3e1c67d093425773171e52c9beadae1bbd4a34d0fc8778843255da0a" Oct 03 08:07:15 crc kubenswrapper[4664]: I1003 08:07:15.393090 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-6rhzw" Oct 03 08:07:15 crc kubenswrapper[4664]: I1003 08:07:15.396421 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" event={"ID":"455d48b8-17a8-4c2f-9923-a30626506388","Type":"ContainerStarted","Data":"f11f91f1b8e17dcbb36ff90cf9bf1aecf3a5a37bbc0f64b527cf36e74cb8321f"} Oct 03 08:07:15 crc kubenswrapper[4664]: I1003 08:07:15.396659 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 03 08:07:15 crc kubenswrapper[4664]: I1003 08:07:15.467449 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" podStartSLOduration=3.467431776 podStartE2EDuration="3.467431776s" podCreationTimestamp="2025-10-03 08:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:07:15.462533447 +0000 UTC m=+1136.283723937" watchObservedRunningTime="2025-10-03 08:07:15.467431776 +0000 UTC m=+1136.288622266" Oct 03 08:07:15 crc kubenswrapper[4664]: I1003 08:07:15.505311 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-6rhzw"] Oct 03 08:07:15 crc kubenswrapper[4664]: I1003 08:07:15.512377 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-6rhzw"] Oct 03 08:07:15 crc kubenswrapper[4664]: I1003 08:07:15.886583 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8f29518-3766-425f-b4fd-737fde3bf509" path="/var/lib/kubelet/pods/e8f29518-3766-425f-b4fd-737fde3bf509/volumes" Oct 03 08:07:15 crc kubenswrapper[4664]: I1003 08:07:15.932824 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:15 crc kubenswrapper[4664]: E1003 08:07:15.933051 4664 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 08:07:15 crc kubenswrapper[4664]: E1003 08:07:15.933094 4664 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 08:07:15 crc kubenswrapper[4664]: E1003 08:07:15.933166 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift podName:2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e nodeName:}" failed. No retries permitted until 2025-10-03 08:07:17.933146722 +0000 UTC m=+1138.754337212 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift") pod "swift-storage-0" (UID: "2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e") : configmap "swift-ring-files" not found Oct 03 08:07:16 crc kubenswrapper[4664]: I1003 08:07:16.438160 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:17 crc kubenswrapper[4664]: I1003 08:07:17.947103 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-ms9ms"] Oct 03 08:07:17 crc kubenswrapper[4664]: E1003 08:07:17.947454 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f29518-3766-425f-b4fd-737fde3bf509" containerName="init" Oct 03 08:07:17 crc kubenswrapper[4664]: I1003 08:07:17.947467 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f29518-3766-425f-b4fd-737fde3bf509" containerName="init" Oct 03 08:07:17 crc kubenswrapper[4664]: I1003 08:07:17.947650 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8f29518-3766-425f-b4fd-737fde3bf509" containerName="init" Oct 03 08:07:17 crc kubenswrapper[4664]: I1003 08:07:17.948156 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:17 crc kubenswrapper[4664]: I1003 08:07:17.950160 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 03 08:07:17 crc kubenswrapper[4664]: I1003 08:07:17.950541 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 03 08:07:17 crc kubenswrapper[4664]: I1003 08:07:17.951366 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 08:07:17 crc kubenswrapper[4664]: I1003 08:07:17.969068 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:17 crc kubenswrapper[4664]: E1003 08:07:17.969308 4664 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 08:07:17 crc kubenswrapper[4664]: E1003 08:07:17.969353 4664 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 08:07:17 crc kubenswrapper[4664]: E1003 08:07:17.969421 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift podName:2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e nodeName:}" failed. 
No retries permitted until 2025-10-03 08:07:21.969397007 +0000 UTC m=+1142.790587497 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift") pod "swift-storage-0" (UID: "2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e") : configmap "swift-ring-files" not found Oct 03 08:07:17 crc kubenswrapper[4664]: I1003 08:07:17.994247 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-pwkt9"] Oct 03 08:07:17 crc kubenswrapper[4664]: I1003 08:07:17.995415 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.015306 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-pwkt9"] Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.031507 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-ms9ms"] Oct 03 08:07:18 crc kubenswrapper[4664]: E1003 08:07:18.032407 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-5w858 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-ms9ms" podUID="5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.058058 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-ms9ms"] Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.070673 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-scripts\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.070730 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-ring-data-devices\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.070753 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-dispersionconf\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.070949 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-combined-ca-bundle\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.071026 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w858\" (UniqueName: \"kubernetes.io/projected/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-kube-api-access-5w858\") pod \"swift-ring-rebalance-ms9ms\" (UID: 
\"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.071121 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-etc-swift\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.071232 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-swiftconf\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.173467 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tjcd\" (UniqueName: \"kubernetes.io/projected/861470e0-672f-4457-86cd-9711fc6dd059-kube-api-access-5tjcd\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.173622 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-scripts\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.173672 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-ring-data-devices\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.173693 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-dispersionconf\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.173723 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/861470e0-672f-4457-86cd-9711fc6dd059-ring-data-devices\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.173750 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/861470e0-672f-4457-86cd-9711fc6dd059-scripts\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.173768 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-combined-ca-bundle\") pod \"swift-ring-rebalance-ms9ms\" (UID: 
\"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.173786 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-dispersionconf\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.173803 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w858\" (UniqueName: \"kubernetes.io/projected/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-kube-api-access-5w858\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.173820 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-combined-ca-bundle\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.173848 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/861470e0-672f-4457-86cd-9711fc6dd059-etc-swift\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.174007 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-etc-swift\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.174040 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-swiftconf\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.174058 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-swiftconf\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.174656 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-scripts\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.174756 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-etc-swift\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " 
pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.174827 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-ring-data-devices\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.180411 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-swiftconf\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.180631 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-combined-ca-bundle\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.181078 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-dispersionconf\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.195377 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w858\" (UniqueName: \"kubernetes.io/projected/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-kube-api-access-5w858\") pod \"swift-ring-rebalance-ms9ms\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.275489 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-dispersionconf\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.275873 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-combined-ca-bundle\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.275916 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/861470e0-672f-4457-86cd-9711fc6dd059-etc-swift\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.275947 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-swiftconf\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.275995 4664 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5tjcd\" (UniqueName: \"kubernetes.io/projected/861470e0-672f-4457-86cd-9711fc6dd059-kube-api-access-5tjcd\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.276102 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/861470e0-672f-4457-86cd-9711fc6dd059-ring-data-devices\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.276137 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/861470e0-672f-4457-86cd-9711fc6dd059-scripts\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.276525 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/861470e0-672f-4457-86cd-9711fc6dd059-etc-swift\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.276820 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/861470e0-672f-4457-86cd-9711fc6dd059-ring-data-devices\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.276857 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/861470e0-672f-4457-86cd-9711fc6dd059-scripts\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.279364 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-combined-ca-bundle\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.279422 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-swiftconf\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.284036 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-dispersionconf\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.297487 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tjcd\" (UniqueName: 
\"kubernetes.io/projected/861470e0-672f-4457-86cd-9711fc6dd059-kube-api-access-5tjcd\") pod \"swift-ring-rebalance-pwkt9\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.312651 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.451887 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.466418 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.573277 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-pwkt9"] Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.582656 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-dispersionconf\") pod \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.582727 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-swiftconf\") pod \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.582771 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-ring-data-devices\") pod \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.584553 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-scripts\") pod \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.584698 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb" (UID: "5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.584760 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-combined-ca-bundle\") pod \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.584856 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-etc-swift\") pod \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.584964 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w858\" (UniqueName: \"kubernetes.io/projected/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-kube-api-access-5w858\") pod \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\" (UID: \"5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb\") " Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.585200 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-scripts" (OuterVolumeSpecName: "scripts") pod "5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb" (UID: "5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.585971 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb" (UID: "5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.586784 4664 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.586817 4664 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.586831 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.590419 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb" (UID: "5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.590443 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb" (UID: "5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb"). 
InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.592113 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb" (UID: "5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.593513 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-kube-api-access-5w858" (OuterVolumeSpecName: "kube-api-access-5w858") pod "5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb" (UID: "5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb"). InnerVolumeSpecName "kube-api-access-5w858". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.688729 4664 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.688770 4664 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.688779 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:18 crc kubenswrapper[4664]: I1003 08:07:18.688789 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w858\" (UniqueName: \"kubernetes.io/projected/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb-kube-api-access-5w858\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:19 crc kubenswrapper[4664]: I1003 08:07:19.229929 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 03 08:07:19 crc kubenswrapper[4664]: I1003 08:07:19.229972 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 03 08:07:19 crc kubenswrapper[4664]: I1003 08:07:19.465933 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pwkt9" event={"ID":"861470e0-672f-4457-86cd-9711fc6dd059","Type":"ContainerStarted","Data":"c9418ad96532c8e5202a1fc99de4286660968b9b9a212fd872bb11a04dcc3639"} Oct 03 08:07:19 crc kubenswrapper[4664]: I1003 08:07:19.465962 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-ms9ms" Oct 03 08:07:19 crc kubenswrapper[4664]: I1003 08:07:19.509949 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-ms9ms"] Oct 03 08:07:19 crc kubenswrapper[4664]: I1003 08:07:19.518979 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-ms9ms"] Oct 03 08:07:19 crc kubenswrapper[4664]: I1003 08:07:19.888125 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb" path="/var/lib/kubelet/pods/5fadfb62-b3f2-47c2-8e1d-b03767c5a4fb/volumes" Oct 03 08:07:20 crc kubenswrapper[4664]: I1003 08:07:20.972213 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 03 08:07:20 crc kubenswrapper[4664]: I1003 08:07:20.972803 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 03 08:07:21 crc kubenswrapper[4664]: I1003 08:07:21.023441 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 03 08:07:21 crc kubenswrapper[4664]: I1003 08:07:21.286889 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 03 08:07:21 crc kubenswrapper[4664]: I1003 08:07:21.336740 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 03 08:07:21 crc kubenswrapper[4664]: I1003 08:07:21.545297 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 03 08:07:22 crc kubenswrapper[4664]: I1003 08:07:22.045545 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:22 crc kubenswrapper[4664]: E1003 08:07:22.046739 4664 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 08:07:22 crc kubenswrapper[4664]: E1003 08:07:22.046864 4664 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 08:07:22 crc kubenswrapper[4664]: E1003 08:07:22.047046 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift podName:2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e nodeName:}" failed. No retries permitted until 2025-10-03 08:07:30.047020625 +0000 UTC m=+1150.868211285 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift") pod "swift-storage-0" (UID: "2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e") : configmap "swift-ring-files" not found Oct 03 08:07:22 crc kubenswrapper[4664]: I1003 08:07:22.492243 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pwkt9" event={"ID":"861470e0-672f-4457-86cd-9711fc6dd059","Type":"ContainerStarted","Data":"376893dfafad48542de78811a62c595b0a861666d70ef565979b5632d5d804cf"} Oct 03 08:07:22 crc kubenswrapper[4664]: I1003 08:07:22.512867 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-pwkt9" podStartSLOduration=1.9437588 podStartE2EDuration="5.512851664s" podCreationTimestamp="2025-10-03 08:07:17 +0000 UTC" firstStartedPulling="2025-10-03 08:07:18.582411845 +0000 UTC m=+1139.403602335" lastFinishedPulling="2025-10-03 08:07:22.151504709 +0000 UTC m=+1142.972695199" observedRunningTime="2025-10-03 08:07:22.50992845 +0000 UTC m=+1143.331118940" watchObservedRunningTime="2025-10-03 08:07:22.512851664 +0000 UTC m=+1143.334042154" Oct 03 08:07:22 crc kubenswrapper[4664]: I1003 08:07:22.787300 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 08:07:23 crc kubenswrapper[4664]: I1003 08:07:23.274896 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:07:23 crc kubenswrapper[4664]: I1003 08:07:23.338898 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hwlpb"] Oct 03 08:07:23 crc kubenswrapper[4664]: I1003 08:07:23.339189 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb" podUID="96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1" containerName="dnsmasq-dns" containerID="cri-o://290801e369a3162ccccc045da875b27f969d39c2ea636a29e333fb34a4069e3b" gracePeriod=10 Oct 03 08:07:24 crc kubenswrapper[4664]: I1003 08:07:24.512828 4664 generic.go:334] "Generic (PLEG): container finished" podID="96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1" containerID="290801e369a3162ccccc045da875b27f969d39c2ea636a29e333fb34a4069e3b" exitCode=0 Oct 03 08:07:24 crc kubenswrapper[4664]: I1003 08:07:24.512944 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb" event={"ID":"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1","Type":"ContainerDied","Data":"290801e369a3162ccccc045da875b27f969d39c2ea636a29e333fb34a4069e3b"} Oct 03 08:07:25 crc kubenswrapper[4664]: I1003 08:07:25.523116 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb" event={"ID":"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1","Type":"ContainerDied","Data":"813c6da658859d0c7f691cc971c55738fa9cc259173aa8fd5752bc913c2f38dd"} Oct 03 08:07:25 crc kubenswrapper[4664]: I1003 08:07:25.523169 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="813c6da658859d0c7f691cc971c55738fa9cc259173aa8fd5752bc913c2f38dd" Oct 03 08:07:25 crc kubenswrapper[4664]: I1003 08:07:25.530956 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb" Oct 03 08:07:25 crc kubenswrapper[4664]: I1003 08:07:25.609851 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-config\") pod \"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1\" (UID: \"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1\") " Oct 03 08:07:25 crc kubenswrapper[4664]: I1003 08:07:25.610196 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6cg5\" (UniqueName: \"kubernetes.io/projected/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-kube-api-access-r6cg5\") pod \"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1\" (UID: \"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1\") " Oct 03 08:07:25 crc kubenswrapper[4664]: I1003 08:07:25.610358 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-dns-svc\") pod \"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1\" (UID: \"96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1\") " Oct 03 08:07:25 crc kubenswrapper[4664]: I1003 08:07:25.615567 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-kube-api-access-r6cg5" (OuterVolumeSpecName: "kube-api-access-r6cg5") pod "96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1" (UID: "96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1"). InnerVolumeSpecName "kube-api-access-r6cg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:07:25 crc kubenswrapper[4664]: I1003 08:07:25.652456 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-config" (OuterVolumeSpecName: "config") pod "96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1" (UID: "96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:25 crc kubenswrapper[4664]: I1003 08:07:25.653185 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1" (UID: "96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:25 crc kubenswrapper[4664]: I1003 08:07:25.712444 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6cg5\" (UniqueName: \"kubernetes.io/projected/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-kube-api-access-r6cg5\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:25 crc kubenswrapper[4664]: I1003 08:07:25.712487 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:25 crc kubenswrapper[4664]: I1003 08:07:25.712502 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:26 crc kubenswrapper[4664]: I1003 08:07:26.440547 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 03 08:07:26 crc kubenswrapper[4664]: I1003 08:07:26.533329 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hwlpb" Oct 03 08:07:26 crc kubenswrapper[4664]: I1003 08:07:26.556757 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hwlpb"] Oct 03 08:07:26 crc kubenswrapper[4664]: I1003 08:07:26.565193 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hwlpb"] Oct 03 08:07:27 crc kubenswrapper[4664]: I1003 08:07:27.886170 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1" path="/var/lib/kubelet/pods/96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1/volumes" Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.084382 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:30 crc kubenswrapper[4664]: E1003 08:07:30.084866 4664 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 08:07:30 crc kubenswrapper[4664]: E1003 08:07:30.084928 4664 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 08:07:30 crc kubenswrapper[4664]: E1003 08:07:30.085028 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift podName:2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e nodeName:}" failed. No retries permitted until 2025-10-03 08:07:46.084989443 +0000 UTC m=+1166.906179943 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift") pod "swift-storage-0" (UID: "2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e") : configmap "swift-ring-files" not found Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.473540 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-98kgj"] Oct 03 08:07:30 crc kubenswrapper[4664]: E1003 08:07:30.474239 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1" containerName="init" Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.474338 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1" containerName="init" Oct 03 08:07:30 crc kubenswrapper[4664]: E1003 08:07:30.474444 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1" containerName="dnsmasq-dns" Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.474535 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1" containerName="dnsmasq-dns" Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.474820 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="96c8c272-2d72-4eaf-b3b7-0a4a862dd8a1" containerName="dnsmasq-dns" Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.475445 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-98kgj" Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.483349 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-98kgj"] Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.563697 4664 generic.go:334] "Generic (PLEG): container finished" podID="861470e0-672f-4457-86cd-9711fc6dd059" containerID="376893dfafad48542de78811a62c595b0a861666d70ef565979b5632d5d804cf" exitCode=0 Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.563728 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pwkt9" event={"ID":"861470e0-672f-4457-86cd-9711fc6dd059","Type":"ContainerDied","Data":"376893dfafad48542de78811a62c595b0a861666d70ef565979b5632d5d804cf"} Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.591384 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wndkp\" (UniqueName: \"kubernetes.io/projected/6a531dde-5ad1-4118-acef-d104aec77b92-kube-api-access-wndkp\") pod \"keystone-db-create-98kgj\" (UID: \"6a531dde-5ad1-4118-acef-d104aec77b92\") " pod="openstack/keystone-db-create-98kgj" Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.693391 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wndkp\" (UniqueName: \"kubernetes.io/projected/6a531dde-5ad1-4118-acef-d104aec77b92-kube-api-access-wndkp\") pod \"keystone-db-create-98kgj\" (UID: \"6a531dde-5ad1-4118-acef-d104aec77b92\") " pod="openstack/keystone-db-create-98kgj" Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.723592 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wndkp\" (UniqueName: \"kubernetes.io/projected/6a531dde-5ad1-4118-acef-d104aec77b92-kube-api-access-wndkp\") pod \"keystone-db-create-98kgj\" (UID: \"6a531dde-5ad1-4118-acef-d104aec77b92\") " pod="openstack/keystone-db-create-98kgj" Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.727363 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-m7kq6"] Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.728688 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m7kq6" Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.737426 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m7kq6"] Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.795506 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj9t9\" (UniqueName: \"kubernetes.io/projected/25492a2d-416b-4e30-a717-2b814d47066e-kube-api-access-lj9t9\") pod \"placement-db-create-m7kq6\" (UID: \"25492a2d-416b-4e30-a717-2b814d47066e\") " pod="openstack/placement-db-create-m7kq6" Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.802925 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-98kgj" Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.896933 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj9t9\" (UniqueName: \"kubernetes.io/projected/25492a2d-416b-4e30-a717-2b814d47066e-kube-api-access-lj9t9\") pod \"placement-db-create-m7kq6\" (UID: \"25492a2d-416b-4e30-a717-2b814d47066e\") " pod="openstack/placement-db-create-m7kq6" Oct 03 08:07:30 crc kubenswrapper[4664]: I1003 08:07:30.917787 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj9t9\" (UniqueName: \"kubernetes.io/projected/25492a2d-416b-4e30-a717-2b814d47066e-kube-api-access-lj9t9\") pod \"placement-db-create-m7kq6\" (UID: \"25492a2d-416b-4e30-a717-2b814d47066e\") " pod="openstack/placement-db-create-m7kq6" Oct 03 08:07:31 crc kubenswrapper[4664]: I1003 08:07:31.065073 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m7kq6" Oct 03 08:07:31 crc kubenswrapper[4664]: I1003 08:07:31.242781 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-98kgj"] Oct 03 08:07:31 crc kubenswrapper[4664]: I1003 08:07:31.247865 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-drt6h"] Oct 03 08:07:31 crc kubenswrapper[4664]: I1003 08:07:31.248925 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-drt6h" Oct 03 08:07:31 crc kubenswrapper[4664]: W1003 08:07:31.256671 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a531dde_5ad1_4118_acef_d104aec77b92.slice/crio-4649bfe7bc676bdd98863e34f5a1b2d1c2c90b15dfa3015df00db1202793c365 WatchSource:0}: Error finding container 4649bfe7bc676bdd98863e34f5a1b2d1c2c90b15dfa3015df00db1202793c365: Status 404 returned error can't find the container with id 4649bfe7bc676bdd98863e34f5a1b2d1c2c90b15dfa3015df00db1202793c365 Oct 03 08:07:31 crc kubenswrapper[4664]: I1003 08:07:31.263094 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-drt6h"] Oct 03 08:07:31 crc kubenswrapper[4664]: I1003 08:07:31.307323 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spdm8\" (UniqueName: \"kubernetes.io/projected/0b5c50a5-3090-403e-9207-a48e761544c3-kube-api-access-spdm8\") pod \"glance-db-create-drt6h\" (UID: \"0b5c50a5-3090-403e-9207-a48e761544c3\") " pod="openstack/glance-db-create-drt6h" Oct 03 08:07:31 crc kubenswrapper[4664]: I1003 08:07:31.409942 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spdm8\" (UniqueName: \"kubernetes.io/projected/0b5c50a5-3090-403e-9207-a48e761544c3-kube-api-access-spdm8\") pod \"glance-db-create-drt6h\" (UID: \"0b5c50a5-3090-403e-9207-a48e761544c3\") " pod="openstack/glance-db-create-drt6h" Oct 03 08:07:31 crc kubenswrapper[4664]: I1003 08:07:31.429199 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spdm8\" (UniqueName: \"kubernetes.io/projected/0b5c50a5-3090-403e-9207-a48e761544c3-kube-api-access-spdm8\") pod \"glance-db-create-drt6h\" (UID: \"0b5c50a5-3090-403e-9207-a48e761544c3\") " pod="openstack/glance-db-create-drt6h" Oct 03 08:07:31 crc kubenswrapper[4664]: I1003 08:07:31.497856 4664 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ovn-controller-fb5ld" podUID="b65aa3e9-2d60-4cd9-b63a-93a07ab33e72" containerName="ovn-controller" probeResult="failure" output=< Oct 03 08:07:31 crc kubenswrapper[4664]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 03 08:07:31 crc kubenswrapper[4664]: > Oct 03 08:07:31 crc kubenswrapper[4664]: I1003 08:07:31.572876 4664 generic.go:334] "Generic (PLEG): container finished" podID="6a531dde-5ad1-4118-acef-d104aec77b92" containerID="1643f639e33cd2766005bc72f8b4849c583f9b645475ace313ca004b3e1cc402" exitCode=0 Oct 03 08:07:31 crc kubenswrapper[4664]: I1003 08:07:31.573057 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-98kgj" event={"ID":"6a531dde-5ad1-4118-acef-d104aec77b92","Type":"ContainerDied","Data":"1643f639e33cd2766005bc72f8b4849c583f9b645475ace313ca004b3e1cc402"} Oct 03 08:07:31 crc kubenswrapper[4664]: I1003 08:07:31.573082 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-98kgj" event={"ID":"6a531dde-5ad1-4118-acef-d104aec77b92","Type":"ContainerStarted","Data":"4649bfe7bc676bdd98863e34f5a1b2d1c2c90b15dfa3015df00db1202793c365"} Oct 03 08:07:31 crc kubenswrapper[4664]: I1003 08:07:31.578977 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-drt6h" Oct 03 08:07:31 crc kubenswrapper[4664]: I1003 08:07:31.602264 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m7kq6"] Oct 03 08:07:31 crc kubenswrapper[4664]: W1003 08:07:31.618567 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25492a2d_416b_4e30_a717_2b814d47066e.slice/crio-2cea30bad86639736a0c5a4812523c64ca9dfba18e12e2ff731eb8d90b05b073 WatchSource:0}: Error finding container 2cea30bad86639736a0c5a4812523c64ca9dfba18e12e2ff731eb8d90b05b073: Status 404 returned error can't find the container with id 2cea30bad86639736a0c5a4812523c64ca9dfba18e12e2ff731eb8d90b05b073 Oct 03 08:07:31 crc kubenswrapper[4664]: I1003 08:07:31.940965 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-pwkt9" Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.021475 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/861470e0-672f-4457-86cd-9711fc6dd059-scripts\") pod \"861470e0-672f-4457-86cd-9711fc6dd059\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.021593 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tjcd\" (UniqueName: \"kubernetes.io/projected/861470e0-672f-4457-86cd-9711fc6dd059-kube-api-access-5tjcd\") pod \"861470e0-672f-4457-86cd-9711fc6dd059\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.021650 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/861470e0-672f-4457-86cd-9711fc6dd059-etc-swift\") pod \"861470e0-672f-4457-86cd-9711fc6dd059\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.021706 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-dispersionconf\") pod \"861470e0-672f-4457-86cd-9711fc6dd059\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.021773 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-swiftconf\") pod \"861470e0-672f-4457-86cd-9711fc6dd059\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.021816 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/861470e0-672f-4457-86cd-9711fc6dd059-ring-data-devices\") pod \"861470e0-672f-4457-86cd-9711fc6dd059\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.021841 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-combined-ca-bundle\") pod \"861470e0-672f-4457-86cd-9711fc6dd059\" (UID: \"861470e0-672f-4457-86cd-9711fc6dd059\") " Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.023088 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/861470e0-672f-4457-86cd-9711fc6dd059-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "861470e0-672f-4457-86cd-9711fc6dd059" (UID: "861470e0-672f-4457-86cd-9711fc6dd059"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.023453 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/861470e0-672f-4457-86cd-9711fc6dd059-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "861470e0-672f-4457-86cd-9711fc6dd059" (UID: "861470e0-672f-4457-86cd-9711fc6dd059"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.028362 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/861470e0-672f-4457-86cd-9711fc6dd059-kube-api-access-5tjcd" (OuterVolumeSpecName: "kube-api-access-5tjcd") pod "861470e0-672f-4457-86cd-9711fc6dd059" (UID: "861470e0-672f-4457-86cd-9711fc6dd059"). InnerVolumeSpecName "kube-api-access-5tjcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.048426 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/861470e0-672f-4457-86cd-9711fc6dd059-scripts" (OuterVolumeSpecName: "scripts") pod "861470e0-672f-4457-86cd-9711fc6dd059" (UID: "861470e0-672f-4457-86cd-9711fc6dd059"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.049825 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "861470e0-672f-4457-86cd-9711fc6dd059" (UID: "861470e0-672f-4457-86cd-9711fc6dd059"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.050778 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "861470e0-672f-4457-86cd-9711fc6dd059" (UID: "861470e0-672f-4457-86cd-9711fc6dd059"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.054836 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "861470e0-672f-4457-86cd-9711fc6dd059" (UID: "861470e0-672f-4457-86cd-9711fc6dd059"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.081109 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-drt6h"] Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.123896 4664 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.123943 4664 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.123956 4664 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/861470e0-672f-4457-86cd-9711fc6dd059-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.123970 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861470e0-672f-4457-86cd-9711fc6dd059-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.123982 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/861470e0-672f-4457-86cd-9711fc6dd059-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.123993 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tjcd\" (UniqueName: \"kubernetes.io/projected/861470e0-672f-4457-86cd-9711fc6dd059-kube-api-access-5tjcd\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.124006 4664 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/861470e0-672f-4457-86cd-9711fc6dd059-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.581583 4664 generic.go:334] "Generic (PLEG): container finished" podID="0b5c50a5-3090-403e-9207-a48e761544c3" containerID="f36f3035ecc75d0a667c02b19ccd5b69308b32c3f95f1d1b83f8b6b1576bebed" exitCode=0 Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.581746 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-drt6h" event={"ID":"0b5c50a5-3090-403e-9207-a48e761544c3","Type":"ContainerDied","Data":"f36f3035ecc75d0a667c02b19ccd5b69308b32c3f95f1d1b83f8b6b1576bebed"} Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.581839 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-drt6h" event={"ID":"0b5c50a5-3090-403e-9207-a48e761544c3","Type":"ContainerStarted","Data":"b4eacebd9db0e59eb18de2b6318331a6ac768ad14bfcac566dc192a99e0a3df2"} Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.584102 4664 generic.go:334] "Generic (PLEG): container finished" podID="25492a2d-416b-4e30-a717-2b814d47066e" containerID="13398b4f9ee8746d3fd93b75af8e690a8ab7dee644b3b721d295c8fbca10c576" exitCode=0 Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.584177 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m7kq6" event={"ID":"25492a2d-416b-4e30-a717-2b814d47066e","Type":"ContainerDied","Data":"13398b4f9ee8746d3fd93b75af8e690a8ab7dee644b3b721d295c8fbca10c576"} Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 
Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.584212 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m7kq6" event={"ID":"25492a2d-416b-4e30-a717-2b814d47066e","Type":"ContainerStarted","Data":"2cea30bad86639736a0c5a4812523c64ca9dfba18e12e2ff731eb8d90b05b073"}
Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.586144 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pwkt9" event={"ID":"861470e0-672f-4457-86cd-9711fc6dd059","Type":"ContainerDied","Data":"c9418ad96532c8e5202a1fc99de4286660968b9b9a212fd872bb11a04dcc3639"}
Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.586181 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9418ad96532c8e5202a1fc99de4286660968b9b9a212fd872bb11a04dcc3639"
Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.586190 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pwkt9"
Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.918267 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-98kgj"
Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.937287 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wndkp\" (UniqueName: \"kubernetes.io/projected/6a531dde-5ad1-4118-acef-d104aec77b92-kube-api-access-wndkp\") pod \"6a531dde-5ad1-4118-acef-d104aec77b92\" (UID: \"6a531dde-5ad1-4118-acef-d104aec77b92\") "
Oct 03 08:07:32 crc kubenswrapper[4664]: I1003 08:07:32.946899 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a531dde-5ad1-4118-acef-d104aec77b92-kube-api-access-wndkp" (OuterVolumeSpecName: "kube-api-access-wndkp") pod "6a531dde-5ad1-4118-acef-d104aec77b92" (UID: "6a531dde-5ad1-4118-acef-d104aec77b92"). InnerVolumeSpecName "kube-api-access-wndkp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:07:33 crc kubenswrapper[4664]: I1003 08:07:33.039212 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wndkp\" (UniqueName: \"kubernetes.io/projected/6a531dde-5ad1-4118-acef-d104aec77b92-kube-api-access-wndkp\") on node \"crc\" DevicePath \"\""
Oct 03 08:07:33 crc kubenswrapper[4664]: I1003 08:07:33.596962 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-98kgj" event={"ID":"6a531dde-5ad1-4118-acef-d104aec77b92","Type":"ContainerDied","Data":"4649bfe7bc676bdd98863e34f5a1b2d1c2c90b15dfa3015df00db1202793c365"}
Oct 03 08:07:33 crc kubenswrapper[4664]: I1003 08:07:33.597016 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4649bfe7bc676bdd98863e34f5a1b2d1c2c90b15dfa3015df00db1202793c365"
Oct 03 08:07:33 crc kubenswrapper[4664]: I1003 08:07:33.597085 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-98kgj"
Oct 03 08:07:33 crc kubenswrapper[4664]: I1003 08:07:33.983708 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m7kq6"
Oct 03 08:07:33 crc kubenswrapper[4664]: I1003 08:07:33.990098 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-drt6h"
Oct 03 08:07:34 crc kubenswrapper[4664]: I1003 08:07:34.059117 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj9t9\" (UniqueName: \"kubernetes.io/projected/25492a2d-416b-4e30-a717-2b814d47066e-kube-api-access-lj9t9\") pod \"25492a2d-416b-4e30-a717-2b814d47066e\" (UID: \"25492a2d-416b-4e30-a717-2b814d47066e\") "
Oct 03 08:07:34 crc kubenswrapper[4664]: I1003 08:07:34.059329 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spdm8\" (UniqueName: \"kubernetes.io/projected/0b5c50a5-3090-403e-9207-a48e761544c3-kube-api-access-spdm8\") pod \"0b5c50a5-3090-403e-9207-a48e761544c3\" (UID: \"0b5c50a5-3090-403e-9207-a48e761544c3\") "
Oct 03 08:07:34 crc kubenswrapper[4664]: I1003 08:07:34.063203 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b5c50a5-3090-403e-9207-a48e761544c3-kube-api-access-spdm8" (OuterVolumeSpecName: "kube-api-access-spdm8") pod "0b5c50a5-3090-403e-9207-a48e761544c3" (UID: "0b5c50a5-3090-403e-9207-a48e761544c3"). InnerVolumeSpecName "kube-api-access-spdm8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:07:34 crc kubenswrapper[4664]: I1003 08:07:34.063819 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25492a2d-416b-4e30-a717-2b814d47066e-kube-api-access-lj9t9" (OuterVolumeSpecName: "kube-api-access-lj9t9") pod "25492a2d-416b-4e30-a717-2b814d47066e" (UID: "25492a2d-416b-4e30-a717-2b814d47066e"). InnerVolumeSpecName "kube-api-access-lj9t9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:07:34 crc kubenswrapper[4664]: I1003 08:07:34.162151 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj9t9\" (UniqueName: \"kubernetes.io/projected/25492a2d-416b-4e30-a717-2b814d47066e-kube-api-access-lj9t9\") on node \"crc\" DevicePath \"\""
Oct 03 08:07:34 crc kubenswrapper[4664]: I1003 08:07:34.162224 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spdm8\" (UniqueName: \"kubernetes.io/projected/0b5c50a5-3090-403e-9207-a48e761544c3-kube-api-access-spdm8\") on node \"crc\" DevicePath \"\""
Oct 03 08:07:34 crc kubenswrapper[4664]: I1003 08:07:34.606836 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-drt6h"
Oct 03 08:07:34 crc kubenswrapper[4664]: I1003 08:07:34.606834 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-drt6h" event={"ID":"0b5c50a5-3090-403e-9207-a48e761544c3","Type":"ContainerDied","Data":"b4eacebd9db0e59eb18de2b6318331a6ac768ad14bfcac566dc192a99e0a3df2"}
Oct 03 08:07:34 crc kubenswrapper[4664]: I1003 08:07:34.607048 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4eacebd9db0e59eb18de2b6318331a6ac768ad14bfcac566dc192a99e0a3df2"
Oct 03 08:07:34 crc kubenswrapper[4664]: I1003 08:07:34.608057 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m7kq6" event={"ID":"25492a2d-416b-4e30-a717-2b814d47066e","Type":"ContainerDied","Data":"2cea30bad86639736a0c5a4812523c64ca9dfba18e12e2ff731eb8d90b05b073"}
Oct 03 08:07:34 crc kubenswrapper[4664]: I1003 08:07:34.608089 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cea30bad86639736a0c5a4812523c64ca9dfba18e12e2ff731eb8d90b05b073"
Oct 03 08:07:34 crc kubenswrapper[4664]: I1003 08:07:34.608145 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m7kq6"
Oct 03 08:07:35 crc kubenswrapper[4664]: I1003 08:07:35.617930 4664 generic.go:334] "Generic (PLEG): container finished" podID="b8ae1def-1d1a-4acd-af78-204219a99fe6" containerID="62f158f76865a1bd74283911d566ec6f8f9e54cea9ffdb4b21088f5420dd8544" exitCode=0
Oct 03 08:07:35 crc kubenswrapper[4664]: I1003 08:07:35.618050 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8ae1def-1d1a-4acd-af78-204219a99fe6","Type":"ContainerDied","Data":"62f158f76865a1bd74283911d566ec6f8f9e54cea9ffdb4b21088f5420dd8544"}
Oct 03 08:07:36 crc kubenswrapper[4664]: I1003 08:07:36.500304 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fb5ld" podUID="b65aa3e9-2d60-4cd9-b63a-93a07ab33e72" containerName="ovn-controller" probeResult="failure" output=<
Oct 03 08:07:36 crc kubenswrapper[4664]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Oct 03 08:07:36 crc kubenswrapper[4664]: >
Oct 03 08:07:36 crc kubenswrapper[4664]: I1003 08:07:36.576167 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qcwp9"
Oct 03 08:07:36 crc kubenswrapper[4664]: I1003 08:07:36.576844 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qcwp9"
Oct 03 08:07:36 crc kubenswrapper[4664]: I1003 08:07:36.630992 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8ae1def-1d1a-4acd-af78-204219a99fe6","Type":"ContainerStarted","Data":"99a69382c766c5f7bd613bbec49e828ed57ab948713f9652a5673c0488212a4d"}
Oct 03 08:07:36 crc kubenswrapper[4664]: I1003 08:07:36.631468 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:07:36 crc kubenswrapper[4664]: I1003 08:07:36.829028 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.067868524 podStartE2EDuration="1m20.828994219s" podCreationTimestamp="2025-10-03 08:06:16 +0000 UTC" firstStartedPulling="2025-10-03 08:06:18.501545445 +0000 UTC m=+1079.322735935" lastFinishedPulling="2025-10-03 08:07:01.26267114 +0000 UTC m=+1122.083861630" observedRunningTime="2025-10-03 08:07:36.66634337 +0000 UTC m=+1157.487533880" watchObservedRunningTime="2025-10-03 08:07:36.828994219 +0000 UTC m=+1157.650184709"
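[editor's note] The pod_startup_latency_tracker entry above carries enough fields to rederive both durations: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that E2E value minus the image-pull window (lastFinishedPulling minus firstStartedPulling); the arithmetic below reproduces 1m20.828994219s and 38.067868524 exactly. A sketch under that assumption:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	// Timestamps copied from the rabbitmq-cell1-server-0 entry above.
	created := parse("2025-10-03 08:06:16 +0000 UTC")
	firstPull := parse("2025-10-03 08:06:18.501545445 +0000 UTC")
	lastPull := parse("2025-10-03 08:07:01.26267114 +0000 UTC")
	observed := parse("2025-10-03 08:07:36.828994219 +0000 UTC")

	e2e := observed.Sub(created)    // 1m20.828994219s = podStartE2EDuration
	pull := lastPull.Sub(firstPull) // 42.761125695s spent pulling images
	slo := e2e - pull               // 38.067868524s = podStartSLOduration
	fmt.Println(e2e, pull.Seconds(), slo.Seconds())
}
```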
Need to start a new one" pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:36 crc kubenswrapper[4664]: I1003 08:07:36.844245 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 03 08:07:36 crc kubenswrapper[4664]: I1003 08:07:36.851196 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fb5ld-config-hbj69"] Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.015041 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtv6j\" (UniqueName: \"kubernetes.io/projected/a0246995-3535-47a2-a92b-3af739508bb5-kube-api-access-rtv6j\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.015141 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a0246995-3535-47a2-a92b-3af739508bb5-additional-scripts\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.015514 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-run-ovn\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.015671 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0246995-3535-47a2-a92b-3af739508bb5-scripts\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.015857 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-log-ovn\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.016035 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-run\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.118070 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-run-ovn\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.118176 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0246995-3535-47a2-a92b-3af739508bb5-scripts\") pod 
\"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.118226 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-log-ovn\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.118264 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-run\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.118298 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtv6j\" (UniqueName: \"kubernetes.io/projected/a0246995-3535-47a2-a92b-3af739508bb5-kube-api-access-rtv6j\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.118329 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a0246995-3535-47a2-a92b-3af739508bb5-additional-scripts\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.119273 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a0246995-3535-47a2-a92b-3af739508bb5-additional-scripts\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.119708 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-run-ovn\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.121423 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0246995-3535-47a2-a92b-3af739508bb5-scripts\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.121492 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-log-ovn\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.121532 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-run\") pod 
\"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.142766 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtv6j\" (UniqueName: \"kubernetes.io/projected/a0246995-3535-47a2-a92b-3af739508bb5-kube-api-access-rtv6j\") pod \"ovn-controller-fb5ld-config-hbj69\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.170299 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:37 crc kubenswrapper[4664]: I1003 08:07:37.628355 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fb5ld-config-hbj69"] Oct 03 08:07:38 crc kubenswrapper[4664]: I1003 08:07:38.656912 4664 generic.go:334] "Generic (PLEG): container finished" podID="a0246995-3535-47a2-a92b-3af739508bb5" containerID="565fead3d3a1e0e35eed5e8760a29b61f356235e0ca65465e8ac8dee0f2e8ee0" exitCode=0 Oct 03 08:07:38 crc kubenswrapper[4664]: I1003 08:07:38.657137 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fb5ld-config-hbj69" event={"ID":"a0246995-3535-47a2-a92b-3af739508bb5","Type":"ContainerDied","Data":"565fead3d3a1e0e35eed5e8760a29b61f356235e0ca65465e8ac8dee0f2e8ee0"} Oct 03 08:07:38 crc kubenswrapper[4664]: I1003 08:07:38.657487 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fb5ld-config-hbj69" event={"ID":"a0246995-3535-47a2-a92b-3af739508bb5","Type":"ContainerStarted","Data":"29250a4bfe64c295c3faf18a0b69f043fe4ac23d8092eeb6f886da2a9f539d13"} Oct 03 08:07:39 crc kubenswrapper[4664]: I1003 08:07:39.972131 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.074027 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-run\") pod \"a0246995-3535-47a2-a92b-3af739508bb5\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.074125 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0246995-3535-47a2-a92b-3af739508bb5-scripts\") pod \"a0246995-3535-47a2-a92b-3af739508bb5\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.074137 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-run" (OuterVolumeSpecName: "var-run") pod "a0246995-3535-47a2-a92b-3af739508bb5" (UID: "a0246995-3535-47a2-a92b-3af739508bb5"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.074275 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-log-ovn\") pod \"a0246995-3535-47a2-a92b-3af739508bb5\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.074325 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a0246995-3535-47a2-a92b-3af739508bb5-additional-scripts\") pod \"a0246995-3535-47a2-a92b-3af739508bb5\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.074350 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-run-ovn\") pod \"a0246995-3535-47a2-a92b-3af739508bb5\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.074364 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a0246995-3535-47a2-a92b-3af739508bb5" (UID: "a0246995-3535-47a2-a92b-3af739508bb5"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.074395 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtv6j\" (UniqueName: \"kubernetes.io/projected/a0246995-3535-47a2-a92b-3af739508bb5-kube-api-access-rtv6j\") pod \"a0246995-3535-47a2-a92b-3af739508bb5\" (UID: \"a0246995-3535-47a2-a92b-3af739508bb5\") " Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.074471 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a0246995-3535-47a2-a92b-3af739508bb5" (UID: "a0246995-3535-47a2-a92b-3af739508bb5"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.074856 4664 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.074878 4664 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.074891 4664 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a0246995-3535-47a2-a92b-3af739508bb5-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.075253 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0246995-3535-47a2-a92b-3af739508bb5-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a0246995-3535-47a2-a92b-3af739508bb5" (UID: "a0246995-3535-47a2-a92b-3af739508bb5"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.075459 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0246995-3535-47a2-a92b-3af739508bb5-scripts" (OuterVolumeSpecName: "scripts") pod "a0246995-3535-47a2-a92b-3af739508bb5" (UID: "a0246995-3535-47a2-a92b-3af739508bb5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.090021 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0246995-3535-47a2-a92b-3af739508bb5-kube-api-access-rtv6j" (OuterVolumeSpecName: "kube-api-access-rtv6j") pod "a0246995-3535-47a2-a92b-3af739508bb5" (UID: "a0246995-3535-47a2-a92b-3af739508bb5"). InnerVolumeSpecName "kube-api-access-rtv6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.176962 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0246995-3535-47a2-a92b-3af739508bb5-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.177015 4664 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a0246995-3535-47a2-a92b-3af739508bb5-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.177032 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtv6j\" (UniqueName: \"kubernetes.io/projected/a0246995-3535-47a2-a92b-3af739508bb5-kube-api-access-rtv6j\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.510409 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3b68-account-create-tcs5r"] Oct 03 08:07:40 crc kubenswrapper[4664]: E1003 08:07:40.510866 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0246995-3535-47a2-a92b-3af739508bb5" containerName="ovn-config" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.510890 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0246995-3535-47a2-a92b-3af739508bb5" containerName="ovn-config" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.511094 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0246995-3535-47a2-a92b-3af739508bb5" containerName="ovn-config" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.511795 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3b68-account-create-tcs5r" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.514708 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.526876 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3b68-account-create-tcs5r"] Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.673870 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fb5ld-config-hbj69" event={"ID":"a0246995-3535-47a2-a92b-3af739508bb5","Type":"ContainerDied","Data":"29250a4bfe64c295c3faf18a0b69f043fe4ac23d8092eeb6f886da2a9f539d13"} Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.673915 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29250a4bfe64c295c3faf18a0b69f043fe4ac23d8092eeb6f886da2a9f539d13" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.673917 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fb5ld-config-hbj69" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.684114 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxtpc\" (UniqueName: \"kubernetes.io/projected/19b6d284-9fc0-4929-97d1-1841f75ea25c-kube-api-access-vxtpc\") pod \"keystone-3b68-account-create-tcs5r\" (UID: \"19b6d284-9fc0-4929-97d1-1841f75ea25c\") " pod="openstack/keystone-3b68-account-create-tcs5r" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.785846 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxtpc\" (UniqueName: \"kubernetes.io/projected/19b6d284-9fc0-4929-97d1-1841f75ea25c-kube-api-access-vxtpc\") pod \"keystone-3b68-account-create-tcs5r\" (UID: \"19b6d284-9fc0-4929-97d1-1841f75ea25c\") " pod="openstack/keystone-3b68-account-create-tcs5r" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.816677 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxtpc\" (UniqueName: \"kubernetes.io/projected/19b6d284-9fc0-4929-97d1-1841f75ea25c-kube-api-access-vxtpc\") pod \"keystone-3b68-account-create-tcs5r\" (UID: \"19b6d284-9fc0-4929-97d1-1841f75ea25c\") " pod="openstack/keystone-3b68-account-create-tcs5r" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.835341 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3b68-account-create-tcs5r" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.872910 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-fbb7-account-create-ggzfs"] Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.874521 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fbb7-account-create-ggzfs" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.878560 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.883549 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fbb7-account-create-ggzfs"] Oct 03 08:07:40 crc kubenswrapper[4664]: I1003 08:07:40.989328 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns5xk\" (UniqueName: \"kubernetes.io/projected/6b3ab103-d868-4ca1-98ef-90d62174e20d-kube-api-access-ns5xk\") pod \"placement-fbb7-account-create-ggzfs\" (UID: \"6b3ab103-d868-4ca1-98ef-90d62174e20d\") " pod="openstack/placement-fbb7-account-create-ggzfs" Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.091124 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns5xk\" (UniqueName: \"kubernetes.io/projected/6b3ab103-d868-4ca1-98ef-90d62174e20d-kube-api-access-ns5xk\") pod \"placement-fbb7-account-create-ggzfs\" (UID: \"6b3ab103-d868-4ca1-98ef-90d62174e20d\") " pod="openstack/placement-fbb7-account-create-ggzfs" Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.096410 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fb5ld-config-hbj69"] Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.104178 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fb5ld-config-hbj69"] Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.122584 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns5xk\" (UniqueName: \"kubernetes.io/projected/6b3ab103-d868-4ca1-98ef-90d62174e20d-kube-api-access-ns5xk\") pod \"placement-fbb7-account-create-ggzfs\" (UID: \"6b3ab103-d868-4ca1-98ef-90d62174e20d\") " pod="openstack/placement-fbb7-account-create-ggzfs" Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.148569 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3b68-account-create-tcs5r"] Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.266470 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fbb7-account-create-ggzfs" Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.364569 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f9ab-account-create-62kjt"] Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.365760 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f9ab-account-create-62kjt" Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.371672 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.381455 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f9ab-account-create-62kjt"] Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.501396 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t6cg\" (UniqueName: \"kubernetes.io/projected/6bba7e96-7b1b-4071-84d6-bf2f6705ca0b-kube-api-access-5t6cg\") pod \"glance-f9ab-account-create-62kjt\" (UID: \"6bba7e96-7b1b-4071-84d6-bf2f6705ca0b\") " pod="openstack/glance-f9ab-account-create-62kjt" Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.518835 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-fb5ld" Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.602940 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t6cg\" (UniqueName: \"kubernetes.io/projected/6bba7e96-7b1b-4071-84d6-bf2f6705ca0b-kube-api-access-5t6cg\") pod \"glance-f9ab-account-create-62kjt\" (UID: \"6bba7e96-7b1b-4071-84d6-bf2f6705ca0b\") " pod="openstack/glance-f9ab-account-create-62kjt" Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.624568 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t6cg\" (UniqueName: \"kubernetes.io/projected/6bba7e96-7b1b-4071-84d6-bf2f6705ca0b-kube-api-access-5t6cg\") pod \"glance-f9ab-account-create-62kjt\" (UID: \"6bba7e96-7b1b-4071-84d6-bf2f6705ca0b\") " pod="openstack/glance-f9ab-account-create-62kjt" Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.683225 4664 generic.go:334] "Generic (PLEG): container finished" podID="19b6d284-9fc0-4929-97d1-1841f75ea25c" containerID="1c2ade37c5164cdd797ae746cdf91c1002d670a2d8140d6cfd5fb3fc1263bfbc" exitCode=0 Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.683268 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3b68-account-create-tcs5r" event={"ID":"19b6d284-9fc0-4929-97d1-1841f75ea25c","Type":"ContainerDied","Data":"1c2ade37c5164cdd797ae746cdf91c1002d670a2d8140d6cfd5fb3fc1263bfbc"} Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.683295 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3b68-account-create-tcs5r" event={"ID":"19b6d284-9fc0-4929-97d1-1841f75ea25c","Type":"ContainerStarted","Data":"ce8eccd585478cb3511a185e65e45b7e2cee137d44440da0b4212c56758df960"} Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.706563 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f9ab-account-create-62kjt" Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.889491 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0246995-3535-47a2-a92b-3af739508bb5" path="/var/lib/kubelet/pods/a0246995-3535-47a2-a92b-3af739508bb5/volumes" Oct 03 08:07:41 crc kubenswrapper[4664]: I1003 08:07:41.920784 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fbb7-account-create-ggzfs"] Oct 03 08:07:42 crc kubenswrapper[4664]: I1003 08:07:42.498413 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f9ab-account-create-62kjt"] Oct 03 08:07:42 crc kubenswrapper[4664]: I1003 08:07:42.692035 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f9ab-account-create-62kjt" event={"ID":"6bba7e96-7b1b-4071-84d6-bf2f6705ca0b","Type":"ContainerStarted","Data":"3598baa235357d43849983f1f2fd7dfa67bb716018884758d0750b1aa130832b"} Oct 03 08:07:42 crc kubenswrapper[4664]: I1003 08:07:42.694027 4664 generic.go:334] "Generic (PLEG): container finished" podID="6b3ab103-d868-4ca1-98ef-90d62174e20d" containerID="7299a9da53bdfc3fe22676a95215620d6317d1dac46480e92856710951713fdf" exitCode=0 Oct 03 08:07:42 crc kubenswrapper[4664]: I1003 08:07:42.694487 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fbb7-account-create-ggzfs" event={"ID":"6b3ab103-d868-4ca1-98ef-90d62174e20d","Type":"ContainerDied","Data":"7299a9da53bdfc3fe22676a95215620d6317d1dac46480e92856710951713fdf"} Oct 03 08:07:42 crc kubenswrapper[4664]: I1003 08:07:42.694551 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fbb7-account-create-ggzfs" event={"ID":"6b3ab103-d868-4ca1-98ef-90d62174e20d","Type":"ContainerStarted","Data":"99a6b850dac9fa44b9e0351190c987dc028c298808395c9608064b608184b7a7"} Oct 03 08:07:43 crc kubenswrapper[4664]: I1003 08:07:43.005553 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3b68-account-create-tcs5r" Oct 03 08:07:43 crc kubenswrapper[4664]: I1003 08:07:43.062983 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxtpc\" (UniqueName: \"kubernetes.io/projected/19b6d284-9fc0-4929-97d1-1841f75ea25c-kube-api-access-vxtpc\") pod \"19b6d284-9fc0-4929-97d1-1841f75ea25c\" (UID: \"19b6d284-9fc0-4929-97d1-1841f75ea25c\") " Oct 03 08:07:43 crc kubenswrapper[4664]: I1003 08:07:43.078487 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19b6d284-9fc0-4929-97d1-1841f75ea25c-kube-api-access-vxtpc" (OuterVolumeSpecName: "kube-api-access-vxtpc") pod "19b6d284-9fc0-4929-97d1-1841f75ea25c" (UID: "19b6d284-9fc0-4929-97d1-1841f75ea25c"). InnerVolumeSpecName "kube-api-access-vxtpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:07:43 crc kubenswrapper[4664]: I1003 08:07:43.165325 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxtpc\" (UniqueName: \"kubernetes.io/projected/19b6d284-9fc0-4929-97d1-1841f75ea25c-kube-api-access-vxtpc\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:43 crc kubenswrapper[4664]: I1003 08:07:43.704077 4664 generic.go:334] "Generic (PLEG): container finished" podID="6bba7e96-7b1b-4071-84d6-bf2f6705ca0b" containerID="2c4d2a2835d448567dbf445f8021b9906cd57cb07a25acf1f143e6bc32ecdc48" exitCode=0 Oct 03 08:07:43 crc kubenswrapper[4664]: I1003 08:07:43.704132 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f9ab-account-create-62kjt" event={"ID":"6bba7e96-7b1b-4071-84d6-bf2f6705ca0b","Type":"ContainerDied","Data":"2c4d2a2835d448567dbf445f8021b9906cd57cb07a25acf1f143e6bc32ecdc48"} Oct 03 08:07:43 crc kubenswrapper[4664]: I1003 08:07:43.706525 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3b68-account-create-tcs5r" Oct 03 08:07:43 crc kubenswrapper[4664]: I1003 08:07:43.706512 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3b68-account-create-tcs5r" event={"ID":"19b6d284-9fc0-4929-97d1-1841f75ea25c","Type":"ContainerDied","Data":"ce8eccd585478cb3511a185e65e45b7e2cee137d44440da0b4212c56758df960"} Oct 03 08:07:43 crc kubenswrapper[4664]: I1003 08:07:43.706568 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce8eccd585478cb3511a185e65e45b7e2cee137d44440da0b4212c56758df960" Oct 03 08:07:44 crc kubenswrapper[4664]: I1003 08:07:44.046125 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fbb7-account-create-ggzfs" Oct 03 08:07:44 crc kubenswrapper[4664]: I1003 08:07:44.181850 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns5xk\" (UniqueName: \"kubernetes.io/projected/6b3ab103-d868-4ca1-98ef-90d62174e20d-kube-api-access-ns5xk\") pod \"6b3ab103-d868-4ca1-98ef-90d62174e20d\" (UID: \"6b3ab103-d868-4ca1-98ef-90d62174e20d\") " Oct 03 08:07:44 crc kubenswrapper[4664]: I1003 08:07:44.188502 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b3ab103-d868-4ca1-98ef-90d62174e20d-kube-api-access-ns5xk" (OuterVolumeSpecName: "kube-api-access-ns5xk") pod "6b3ab103-d868-4ca1-98ef-90d62174e20d" (UID: "6b3ab103-d868-4ca1-98ef-90d62174e20d"). InnerVolumeSpecName "kube-api-access-ns5xk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:07:44 crc kubenswrapper[4664]: I1003 08:07:44.285592 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns5xk\" (UniqueName: \"kubernetes.io/projected/6b3ab103-d868-4ca1-98ef-90d62174e20d-kube-api-access-ns5xk\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:44 crc kubenswrapper[4664]: I1003 08:07:44.715509 4664 generic.go:334] "Generic (PLEG): container finished" podID="8bb7b62d-f030-45a7-b9f8-87852ea275de" containerID="dea969824e6c260d6adfd6cb873a9ed48c1243cced9fbb9c84161d22c5a1daa9" exitCode=0 Oct 03 08:07:44 crc kubenswrapper[4664]: I1003 08:07:44.715590 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8bb7b62d-f030-45a7-b9f8-87852ea275de","Type":"ContainerDied","Data":"dea969824e6c260d6adfd6cb873a9ed48c1243cced9fbb9c84161d22c5a1daa9"} Oct 03 08:07:44 crc kubenswrapper[4664]: I1003 08:07:44.718169 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fbb7-account-create-ggzfs" Oct 03 08:07:44 crc kubenswrapper[4664]: I1003 08:07:44.718265 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fbb7-account-create-ggzfs" event={"ID":"6b3ab103-d868-4ca1-98ef-90d62174e20d","Type":"ContainerDied","Data":"99a6b850dac9fa44b9e0351190c987dc028c298808395c9608064b608184b7a7"} Oct 03 08:07:44 crc kubenswrapper[4664]: I1003 08:07:44.718350 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99a6b850dac9fa44b9e0351190c987dc028c298808395c9608064b608184b7a7" Oct 03 08:07:44 crc kubenswrapper[4664]: I1003 08:07:44.954095 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f9ab-account-create-62kjt" Oct 03 08:07:45 crc kubenswrapper[4664]: I1003 08:07:45.097973 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t6cg\" (UniqueName: \"kubernetes.io/projected/6bba7e96-7b1b-4071-84d6-bf2f6705ca0b-kube-api-access-5t6cg\") pod \"6bba7e96-7b1b-4071-84d6-bf2f6705ca0b\" (UID: \"6bba7e96-7b1b-4071-84d6-bf2f6705ca0b\") " Oct 03 08:07:45 crc kubenswrapper[4664]: I1003 08:07:45.103010 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bba7e96-7b1b-4071-84d6-bf2f6705ca0b-kube-api-access-5t6cg" (OuterVolumeSpecName: "kube-api-access-5t6cg") pod "6bba7e96-7b1b-4071-84d6-bf2f6705ca0b" (UID: "6bba7e96-7b1b-4071-84d6-bf2f6705ca0b"). InnerVolumeSpecName "kube-api-access-5t6cg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:07:45 crc kubenswrapper[4664]: I1003 08:07:45.199903 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t6cg\" (UniqueName: \"kubernetes.io/projected/6bba7e96-7b1b-4071-84d6-bf2f6705ca0b-kube-api-access-5t6cg\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:45 crc kubenswrapper[4664]: I1003 08:07:45.728515 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8bb7b62d-f030-45a7-b9f8-87852ea275de","Type":"ContainerStarted","Data":"d549b761d0aa07a6d12f415fc19e3a85b5e532acd6ca6a8b918c4583dae2b9fe"} Oct 03 08:07:45 crc kubenswrapper[4664]: I1003 08:07:45.729105 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 08:07:45 crc kubenswrapper[4664]: I1003 08:07:45.733401 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f9ab-account-create-62kjt" event={"ID":"6bba7e96-7b1b-4071-84d6-bf2f6705ca0b","Type":"ContainerDied","Data":"3598baa235357d43849983f1f2fd7dfa67bb716018884758d0750b1aa130832b"} Oct 03 08:07:45 crc kubenswrapper[4664]: I1003 08:07:45.733628 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3598baa235357d43849983f1f2fd7dfa67bb716018884758d0750b1aa130832b" Oct 03 08:07:45 crc kubenswrapper[4664]: I1003 08:07:45.733457 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f9ab-account-create-62kjt" Oct 03 08:07:45 crc kubenswrapper[4664]: I1003 08:07:45.762668 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371947.092129 podStartE2EDuration="1m29.76264769s" podCreationTimestamp="2025-10-03 08:06:16 +0000 UTC" firstStartedPulling="2025-10-03 08:06:18.144286824 +0000 UTC m=+1078.965477314" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:07:45.755901108 +0000 UTC m=+1166.577091618" watchObservedRunningTime="2025-10-03 08:07:45.76264769 +0000 UTC m=+1166.583838170" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.115748 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.131295 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e-etc-swift\") pod \"swift-storage-0\" (UID: \"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e\") " pod="openstack/swift-storage-0" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.228702 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.760207 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-djcmk"] Oct 03 08:07:46 crc kubenswrapper[4664]: E1003 08:07:46.760917 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b6d284-9fc0-4929-97d1-1841f75ea25c" containerName="mariadb-account-create" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.760934 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b6d284-9fc0-4929-97d1-1841f75ea25c" containerName="mariadb-account-create" Oct 03 08:07:46 crc kubenswrapper[4664]: E1003 08:07:46.760953 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3ab103-d868-4ca1-98ef-90d62174e20d" containerName="mariadb-account-create" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.760959 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3ab103-d868-4ca1-98ef-90d62174e20d" containerName="mariadb-account-create" Oct 03 08:07:46 crc kubenswrapper[4664]: E1003 08:07:46.760974 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bba7e96-7b1b-4071-84d6-bf2f6705ca0b" containerName="mariadb-account-create" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.760981 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bba7e96-7b1b-4071-84d6-bf2f6705ca0b" containerName="mariadb-account-create" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.761153 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="19b6d284-9fc0-4929-97d1-1841f75ea25c" containerName="mariadb-account-create" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.761175 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b3ab103-d868-4ca1-98ef-90d62174e20d" containerName="mariadb-account-create" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.761187 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bba7e96-7b1b-4071-84d6-bf2f6705ca0b" containerName="mariadb-account-create" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.761871 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-djcmk" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.765315 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.765383 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fcz7z" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.771313 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-djcmk"] Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.932549 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-db-sync-config-data\") pod \"glance-db-sync-djcmk\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " pod="openstack/glance-db-sync-djcmk" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.932691 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x5fz\" (UniqueName: \"kubernetes.io/projected/d75d0f35-4cda-4925-8d0d-1666f794ce9b-kube-api-access-2x5fz\") pod \"glance-db-sync-djcmk\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " pod="openstack/glance-db-sync-djcmk" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.932842 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-combined-ca-bundle\") pod \"glance-db-sync-djcmk\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " pod="openstack/glance-db-sync-djcmk" Oct 03 08:07:46 crc kubenswrapper[4664]: I1003 08:07:46.932891 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-config-data\") pod \"glance-db-sync-djcmk\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " pod="openstack/glance-db-sync-djcmk" Oct 03 08:07:47 crc kubenswrapper[4664]: I1003 08:07:47.034852 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x5fz\" (UniqueName: \"kubernetes.io/projected/d75d0f35-4cda-4925-8d0d-1666f794ce9b-kube-api-access-2x5fz\") pod \"glance-db-sync-djcmk\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " pod="openstack/glance-db-sync-djcmk" Oct 03 08:07:47 crc kubenswrapper[4664]: I1003 08:07:47.035057 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-combined-ca-bundle\") pod \"glance-db-sync-djcmk\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " pod="openstack/glance-db-sync-djcmk" Oct 03 08:07:47 crc kubenswrapper[4664]: I1003 08:07:47.035105 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-config-data\") pod \"glance-db-sync-djcmk\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " pod="openstack/glance-db-sync-djcmk" Oct 03 08:07:47 crc kubenswrapper[4664]: I1003 08:07:47.035167 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-db-sync-config-data\") pod 
\"glance-db-sync-djcmk\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " pod="openstack/glance-db-sync-djcmk" Oct 03 08:07:47 crc kubenswrapper[4664]: I1003 08:07:47.041448 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-config-data\") pod \"glance-db-sync-djcmk\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " pod="openstack/glance-db-sync-djcmk" Oct 03 08:07:47 crc kubenswrapper[4664]: I1003 08:07:47.042651 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-combined-ca-bundle\") pod \"glance-db-sync-djcmk\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " pod="openstack/glance-db-sync-djcmk" Oct 03 08:07:47 crc kubenswrapper[4664]: I1003 08:07:47.049230 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-db-sync-config-data\") pod \"glance-db-sync-djcmk\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " pod="openstack/glance-db-sync-djcmk" Oct 03 08:07:47 crc kubenswrapper[4664]: I1003 08:07:47.055170 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x5fz\" (UniqueName: \"kubernetes.io/projected/d75d0f35-4cda-4925-8d0d-1666f794ce9b-kube-api-access-2x5fz\") pod \"glance-db-sync-djcmk\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " pod="openstack/glance-db-sync-djcmk" Oct 03 08:07:47 crc kubenswrapper[4664]: I1003 08:07:47.100585 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-djcmk" Oct 03 08:07:47 crc kubenswrapper[4664]: I1003 08:07:47.134487 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 08:07:47 crc kubenswrapper[4664]: W1003 08:07:47.138460 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2768ef4c_9c4d_40db_a5e0_6b45d1b0d90e.slice/crio-f9c5eb8cc7613fe0f0cb1569f2e723a4cbc337cba9f036e5dbebb5e5c984e0fc WatchSource:0}: Error finding container f9c5eb8cc7613fe0f0cb1569f2e723a4cbc337cba9f036e5dbebb5e5c984e0fc: Status 404 returned error can't find the container with id f9c5eb8cc7613fe0f0cb1569f2e723a4cbc337cba9f036e5dbebb5e5c984e0fc Oct 03 08:07:47 crc kubenswrapper[4664]: I1003 08:07:47.648478 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-djcmk"] Oct 03 08:07:47 crc kubenswrapper[4664]: I1003 08:07:47.755173 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"f9c5eb8cc7613fe0f0cb1569f2e723a4cbc337cba9f036e5dbebb5e5c984e0fc"} Oct 03 08:07:47 crc kubenswrapper[4664]: I1003 08:07:47.757148 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-djcmk" event={"ID":"d75d0f35-4cda-4925-8d0d-1666f794ce9b","Type":"ContainerStarted","Data":"bd1caeea9dc736dc5ba8ee61a006ea55bfef98dd9b57cf82e43a4b437659e188"} Oct 03 08:07:47 crc kubenswrapper[4664]: I1003 08:07:47.802823 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:07:50 crc kubenswrapper[4664]: I1003 08:07:50.782393 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"bc8eaf79a1e54c4623367123c7558369b76625a397be1ea4ddfed960f0e9fef1"} Oct 03 08:07:50 crc kubenswrapper[4664]: I1003 08:07:50.782780 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"47c515b452f016ab73583bc4e89cc5472f72380fdc78f9d31b2921ecd2862e9f"} Oct 03 08:07:51 crc kubenswrapper[4664]: I1003 08:07:51.810119 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"0d04bd8354b0ff27e33a19fe33dc92f16c26f196c99495594c11334faddb3817"} Oct 03 08:07:51 crc kubenswrapper[4664]: I1003 08:07:51.810660 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"68a6140149f0d4c8162c0c6222a8cd3c2cbebf5d5adf03b8f83bf237ad377353"} Oct 03 08:07:54 crc kubenswrapper[4664]: I1003 08:07:54.859827 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"2ccceae18d142f6b464281d36fd6e70511cc91ab77cd7bcbb901c013f7563a3a"} Oct 03 08:07:54 crc kubenswrapper[4664]: I1003 08:07:54.860461 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"808b3090172e95f5c5abc274dd5ae341d865adfbf04feedbdc54c25ae3293b02"} Oct 03 08:07:55 crc kubenswrapper[4664]: I1003 08:07:55.903178 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"88045cab0a56db0742380c69e40efcb25b384ee62ccbf25153b92ce61f525f8a"} Oct 03 08:07:55 crc kubenswrapper[4664]: I1003 08:07:55.903473 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"5a7ba6ba1ee8384ec89756b5f6ce8f9cc1840be94e268782506e8e950cb8c2f9"} Oct 03 08:07:57 crc kubenswrapper[4664]: I1003 08:07:57.523136 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 08:07:57 crc kubenswrapper[4664]: I1003 08:07:57.942959 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"ef9a5eacf39f3df38567da72cb3d8ba266fd8c88b775fba7579995fdc4c8c30e"} Oct 03 08:07:57 crc kubenswrapper[4664]: I1003 08:07:57.996651 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-h4n98"] Oct 03 08:07:57 crc kubenswrapper[4664]: I1003 08:07:57.997802 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-h4n98" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.013960 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h4n98"] Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.182969 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfrfk\" (UniqueName: \"kubernetes.io/projected/13d51df0-aff8-4202-b94c-814faaf05cbb-kube-api-access-lfrfk\") pod \"barbican-db-create-h4n98\" (UID: \"13d51df0-aff8-4202-b94c-814faaf05cbb\") " pod="openstack/barbican-db-create-h4n98" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.285672 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfrfk\" (UniqueName: \"kubernetes.io/projected/13d51df0-aff8-4202-b94c-814faaf05cbb-kube-api-access-lfrfk\") pod \"barbican-db-create-h4n98\" (UID: \"13d51df0-aff8-4202-b94c-814faaf05cbb\") " pod="openstack/barbican-db-create-h4n98" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.286116 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-8mfsn"] Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.287468 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8mfsn" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.290557 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bm9nt" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.290723 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.292993 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8mfsn"] Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.293865 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.294461 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.335201 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfrfk\" (UniqueName: \"kubernetes.io/projected/13d51df0-aff8-4202-b94c-814faaf05cbb-kube-api-access-lfrfk\") pod \"barbican-db-create-h4n98\" (UID: \"13d51df0-aff8-4202-b94c-814faaf05cbb\") " pod="openstack/barbican-db-create-h4n98" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.345599 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-bn2c5"] Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.346805 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-bn2c5" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.380318 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bn2c5"] Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.395616 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9197763c-fb03-4db8-9edb-b35fdc61856f-combined-ca-bundle\") pod \"keystone-db-sync-8mfsn\" (UID: \"9197763c-fb03-4db8-9edb-b35fdc61856f\") " pod="openstack/keystone-db-sync-8mfsn" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.395674 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qsss\" (UniqueName: \"kubernetes.io/projected/9197763c-fb03-4db8-9edb-b35fdc61856f-kube-api-access-6qsss\") pod \"keystone-db-sync-8mfsn\" (UID: \"9197763c-fb03-4db8-9edb-b35fdc61856f\") " pod="openstack/keystone-db-sync-8mfsn" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.395706 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9197763c-fb03-4db8-9edb-b35fdc61856f-config-data\") pod \"keystone-db-sync-8mfsn\" (UID: \"9197763c-fb03-4db8-9edb-b35fdc61856f\") " pod="openstack/keystone-db-sync-8mfsn" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.395734 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgkvh\" (UniqueName: \"kubernetes.io/projected/ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a-kube-api-access-vgkvh\") pod \"cinder-db-create-bn2c5\" (UID: \"ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a\") " pod="openstack/cinder-db-create-bn2c5" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.426679 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-wctgk"] Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.427967 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-wctgk" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.442044 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wctgk"] Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.497527 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbq2k\" (UniqueName: \"kubernetes.io/projected/a629e3c9-2dba-46d6-a978-bc6133d150e1-kube-api-access-bbq2k\") pod \"neutron-db-create-wctgk\" (UID: \"a629e3c9-2dba-46d6-a978-bc6133d150e1\") " pod="openstack/neutron-db-create-wctgk" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.497595 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9197763c-fb03-4db8-9edb-b35fdc61856f-combined-ca-bundle\") pod \"keystone-db-sync-8mfsn\" (UID: \"9197763c-fb03-4db8-9edb-b35fdc61856f\") " pod="openstack/keystone-db-sync-8mfsn" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.497679 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qsss\" (UniqueName: \"kubernetes.io/projected/9197763c-fb03-4db8-9edb-b35fdc61856f-kube-api-access-6qsss\") pod \"keystone-db-sync-8mfsn\" (UID: \"9197763c-fb03-4db8-9edb-b35fdc61856f\") " pod="openstack/keystone-db-sync-8mfsn" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.497702 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9197763c-fb03-4db8-9edb-b35fdc61856f-config-data\") pod \"keystone-db-sync-8mfsn\" (UID: \"9197763c-fb03-4db8-9edb-b35fdc61856f\") " pod="openstack/keystone-db-sync-8mfsn" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.497720 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgkvh\" (UniqueName: \"kubernetes.io/projected/ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a-kube-api-access-vgkvh\") pod \"cinder-db-create-bn2c5\" (UID: \"ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a\") " pod="openstack/cinder-db-create-bn2c5" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.518975 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9197763c-fb03-4db8-9edb-b35fdc61856f-combined-ca-bundle\") pod \"keystone-db-sync-8mfsn\" (UID: \"9197763c-fb03-4db8-9edb-b35fdc61856f\") " pod="openstack/keystone-db-sync-8mfsn" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.520302 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9197763c-fb03-4db8-9edb-b35fdc61856f-config-data\") pod \"keystone-db-sync-8mfsn\" (UID: \"9197763c-fb03-4db8-9edb-b35fdc61856f\") " pod="openstack/keystone-db-sync-8mfsn" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.526148 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgkvh\" (UniqueName: \"kubernetes.io/projected/ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a-kube-api-access-vgkvh\") pod \"cinder-db-create-bn2c5\" (UID: \"ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a\") " pod="openstack/cinder-db-create-bn2c5" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.536354 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qsss\" (UniqueName: \"kubernetes.io/projected/9197763c-fb03-4db8-9edb-b35fdc61856f-kube-api-access-6qsss\") pod 
\"keystone-db-sync-8mfsn\" (UID: \"9197763c-fb03-4db8-9edb-b35fdc61856f\") " pod="openstack/keystone-db-sync-8mfsn" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.601741 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbq2k\" (UniqueName: \"kubernetes.io/projected/a629e3c9-2dba-46d6-a978-bc6133d150e1-kube-api-access-bbq2k\") pod \"neutron-db-create-wctgk\" (UID: \"a629e3c9-2dba-46d6-a978-bc6133d150e1\") " pod="openstack/neutron-db-create-wctgk" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.620242 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbq2k\" (UniqueName: \"kubernetes.io/projected/a629e3c9-2dba-46d6-a978-bc6133d150e1-kube-api-access-bbq2k\") pod \"neutron-db-create-wctgk\" (UID: \"a629e3c9-2dba-46d6-a978-bc6133d150e1\") " pod="openstack/neutron-db-create-wctgk" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.622521 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h4n98" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.694323 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8mfsn" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.714052 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bn2c5" Oct 03 08:07:58 crc kubenswrapper[4664]: I1003 08:07:58.767731 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wctgk" Oct 03 08:08:08 crc kubenswrapper[4664]: E1003 08:08:08.407439 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Oct 03 08:08:08 crc kubenswrapper[4664]: E1003 08:08:08.408291 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2x5fz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-djcmk_openstack(d75d0f35-4cda-4925-8d0d-1666f794ce9b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:08:08 crc kubenswrapper[4664]: E1003 08:08:08.409528 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-djcmk" podUID="d75d0f35-4cda-4925-8d0d-1666f794ce9b" Oct 03 08:08:09 crc kubenswrapper[4664]: I1003 08:08:09.090589 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wctgk"] Oct 03 08:08:09 crc kubenswrapper[4664]: I1003 08:08:09.150099 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8mfsn"] Oct 03 08:08:09 crc kubenswrapper[4664]: I1003 08:08:09.156594 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h4n98"] Oct 03 08:08:09 crc kubenswrapper[4664]: I1003 08:08:09.241108 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bn2c5"] Oct 03 08:08:09 crc kubenswrapper[4664]: I1003 08:08:09.256330 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"73146d594b3cc3b96f3384e4c35d1958db1ee696f93b8573a2ba33e3b13952fb"} Oct 03 08:08:09 crc kubenswrapper[4664]: I1003 08:08:09.256567 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"15a54c932625aaad40a91c095c4662c5860e55daffb1c9be08667087e58b6c01"} Oct 03 08:08:09 crc kubenswrapper[4664]: I1003 08:08:09.256584 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"f16d3726375e0d2721988db7669f9eb0ae6a9b5eb1280f85ec4ff5333b9e7a80"} Oct 03 08:08:09 crc kubenswrapper[4664]: I1003 08:08:09.258184 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h4n98" event={"ID":"13d51df0-aff8-4202-b94c-814faaf05cbb","Type":"ContainerStarted","Data":"2b573e87da1461e51e5ab8a8f032d7ad287d6c5a9b7ec4e028c3d457d7cb3599"} Oct 03 08:08:09 crc kubenswrapper[4664]: I1003 08:08:09.260447 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8mfsn" event={"ID":"9197763c-fb03-4db8-9edb-b35fdc61856f","Type":"ContainerStarted","Data":"b0388cfd51d8cfdb126d3bfe8ab96be69bebe99ce3d2318cb85423400d42d690"} Oct 03 08:08:09 crc kubenswrapper[4664]: I1003 08:08:09.268467 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wctgk" event={"ID":"a629e3c9-2dba-46d6-a978-bc6133d150e1","Type":"ContainerStarted","Data":"6d4e94ca81e9e9ae7e8b1b95e5c42883938d51aa54297c13c7de636270458640"} Oct 03 08:08:09 crc kubenswrapper[4664]: E1003 08:08:09.280799 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-djcmk" podUID="d75d0f35-4cda-4925-8d0d-1666f794ce9b" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.283243 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"d75cea37e9a3939a34224173685a03238fe3f6e0ef9928723b8b8f72cd81aace"} Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.285591 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"a309112c38289333297ab22a3809c7c485db47e34b0eeef4004a5830dfafdbb2"} Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.285986 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e","Type":"ContainerStarted","Data":"481da70e4e031fa1cc2501a5f7689ad9b761060abbc3f7a0fb9e95439ec86a1f"} Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.287937 4664 generic.go:334] "Generic (PLEG): container finished" podID="13d51df0-aff8-4202-b94c-814faaf05cbb" containerID="039bbc1e3505a3edb29319c43b0c1ae59c617ffc88403d3741a4819dbff7a581" exitCode=0 Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.288001 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h4n98" event={"ID":"13d51df0-aff8-4202-b94c-814faaf05cbb","Type":"ContainerDied","Data":"039bbc1e3505a3edb29319c43b0c1ae59c617ffc88403d3741a4819dbff7a581"} Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.289851 4664 generic.go:334] "Generic (PLEG): container finished" podID="a629e3c9-2dba-46d6-a978-bc6133d150e1" containerID="30a26a31e0379301afd7539be361b0208d595d01bd2a18bb8ade062568ec1f4f" exitCode=0 Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 
08:08:10.290073 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wctgk" event={"ID":"a629e3c9-2dba-46d6-a978-bc6133d150e1","Type":"ContainerDied","Data":"30a26a31e0379301afd7539be361b0208d595d01bd2a18bb8ade062568ec1f4f"} Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.292024 4664 generic.go:334] "Generic (PLEG): container finished" podID="ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a" containerID="ddc484af05261aab8ea4328fe71be3ffd1e86e359c997c314493cf278dec7610" exitCode=0 Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.292069 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bn2c5" event={"ID":"ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a","Type":"ContainerDied","Data":"ddc484af05261aab8ea4328fe71be3ffd1e86e359c997c314493cf278dec7610"} Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.292092 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bn2c5" event={"ID":"ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a","Type":"ContainerStarted","Data":"3999ef3e4a8992c6f8103c33f211030bfbb56162181cd2cd3c877c5e8b116e54"} Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.327615 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=48.202913239 podStartE2EDuration="58.327563074s" podCreationTimestamp="2025-10-03 08:07:12 +0000 UTC" firstStartedPulling="2025-10-03 08:07:47.140684532 +0000 UTC m=+1167.961875022" lastFinishedPulling="2025-10-03 08:07:57.265334367 +0000 UTC m=+1178.086524857" observedRunningTime="2025-10-03 08:08:10.317426835 +0000 UTC m=+1191.138617345" watchObservedRunningTime="2025-10-03 08:08:10.327563074 +0000 UTC m=+1191.148753564" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.623696 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-t5mkq"] Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.625882 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.629717 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.645132 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-t5mkq"] Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.749297 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.749772 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.749828 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.749899 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.749952 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tkzq\" (UniqueName: \"kubernetes.io/projected/93914ba6-0d1c-4808-ab73-9e0496e68f51-kube-api-access-7tkzq\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.749981 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-config\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.851661 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tkzq\" (UniqueName: \"kubernetes.io/projected/93914ba6-0d1c-4808-ab73-9e0496e68f51-kube-api-access-7tkzq\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.851711 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-config\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: 
\"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.851783 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.851837 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.851875 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.851897 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.852940 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.854308 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.854520 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.854722 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-config\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.855028 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc 
kubenswrapper[4664]: I1003 08:08:10.885903 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tkzq\" (UniqueName: \"kubernetes.io/projected/93914ba6-0d1c-4808-ab73-9e0496e68f51-kube-api-access-7tkzq\") pod \"dnsmasq-dns-6d5b6d6b67-t5mkq\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:10 crc kubenswrapper[4664]: I1003 08:08:10.949423 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:11 crc kubenswrapper[4664]: I1003 08:08:11.501919 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-t5mkq"] Oct 03 08:08:11 crc kubenswrapper[4664]: I1003 08:08:11.591517 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wctgk" Oct 03 08:08:11 crc kubenswrapper[4664]: I1003 08:08:11.678559 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbq2k\" (UniqueName: \"kubernetes.io/projected/a629e3c9-2dba-46d6-a978-bc6133d150e1-kube-api-access-bbq2k\") pod \"a629e3c9-2dba-46d6-a978-bc6133d150e1\" (UID: \"a629e3c9-2dba-46d6-a978-bc6133d150e1\") " Oct 03 08:08:11 crc kubenswrapper[4664]: I1003 08:08:11.684789 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a629e3c9-2dba-46d6-a978-bc6133d150e1-kube-api-access-bbq2k" (OuterVolumeSpecName: "kube-api-access-bbq2k") pod "a629e3c9-2dba-46d6-a978-bc6133d150e1" (UID: "a629e3c9-2dba-46d6-a978-bc6133d150e1"). InnerVolumeSpecName "kube-api-access-bbq2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:08:11 crc kubenswrapper[4664]: I1003 08:08:11.752393 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h4n98" Oct 03 08:08:11 crc kubenswrapper[4664]: I1003 08:08:11.774918 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bn2c5" Oct 03 08:08:11 crc kubenswrapper[4664]: I1003 08:08:11.780862 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbq2k\" (UniqueName: \"kubernetes.io/projected/a629e3c9-2dba-46d6-a978-bc6133d150e1-kube-api-access-bbq2k\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:11 crc kubenswrapper[4664]: I1003 08:08:11.881671 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgkvh\" (UniqueName: \"kubernetes.io/projected/ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a-kube-api-access-vgkvh\") pod \"ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a\" (UID: \"ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a\") " Oct 03 08:08:11 crc kubenswrapper[4664]: I1003 08:08:11.881962 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfrfk\" (UniqueName: \"kubernetes.io/projected/13d51df0-aff8-4202-b94c-814faaf05cbb-kube-api-access-lfrfk\") pod \"13d51df0-aff8-4202-b94c-814faaf05cbb\" (UID: \"13d51df0-aff8-4202-b94c-814faaf05cbb\") " Oct 03 08:08:11 crc kubenswrapper[4664]: I1003 08:08:11.886555 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13d51df0-aff8-4202-b94c-814faaf05cbb-kube-api-access-lfrfk" (OuterVolumeSpecName: "kube-api-access-lfrfk") pod "13d51df0-aff8-4202-b94c-814faaf05cbb" (UID: "13d51df0-aff8-4202-b94c-814faaf05cbb"). InnerVolumeSpecName "kube-api-access-lfrfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:08:11 crc kubenswrapper[4664]: I1003 08:08:11.889710 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a-kube-api-access-vgkvh" (OuterVolumeSpecName: "kube-api-access-vgkvh") pod "ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a" (UID: "ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a"). InnerVolumeSpecName "kube-api-access-vgkvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:08:11 crc kubenswrapper[4664]: I1003 08:08:11.985173 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgkvh\" (UniqueName: \"kubernetes.io/projected/ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a-kube-api-access-vgkvh\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:11 crc kubenswrapper[4664]: I1003 08:08:11.985214 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfrfk\" (UniqueName: \"kubernetes.io/projected/13d51df0-aff8-4202-b94c-814faaf05cbb-kube-api-access-lfrfk\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:12 crc kubenswrapper[4664]: I1003 08:08:12.316348 4664 generic.go:334] "Generic (PLEG): container finished" podID="93914ba6-0d1c-4808-ab73-9e0496e68f51" containerID="df6cc01e86b33f67c257bf38c657edf17a2b893728e293db060151d521c5f468" exitCode=0 Oct 03 08:08:12 crc kubenswrapper[4664]: I1003 08:08:12.316486 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" event={"ID":"93914ba6-0d1c-4808-ab73-9e0496e68f51","Type":"ContainerDied","Data":"df6cc01e86b33f67c257bf38c657edf17a2b893728e293db060151d521c5f468"} Oct 03 08:08:12 crc kubenswrapper[4664]: I1003 08:08:12.316755 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" event={"ID":"93914ba6-0d1c-4808-ab73-9e0496e68f51","Type":"ContainerStarted","Data":"321deae3cd8e887531be923d82d52b21f1e4df5e33aafcaba3bad624ff39ea2f"} Oct 03 08:08:12 crc kubenswrapper[4664]: I1003 08:08:12.319979 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h4n98" event={"ID":"13d51df0-aff8-4202-b94c-814faaf05cbb","Type":"ContainerDied","Data":"2b573e87da1461e51e5ab8a8f032d7ad287d6c5a9b7ec4e028c3d457d7cb3599"} Oct 03 08:08:12 crc kubenswrapper[4664]: I1003 08:08:12.320040 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b573e87da1461e51e5ab8a8f032d7ad287d6c5a9b7ec4e028c3d457d7cb3599" Oct 03 08:08:12 crc kubenswrapper[4664]: I1003 08:08:12.320087 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h4n98" Oct 03 08:08:12 crc kubenswrapper[4664]: I1003 08:08:12.327553 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wctgk" event={"ID":"a629e3c9-2dba-46d6-a978-bc6133d150e1","Type":"ContainerDied","Data":"6d4e94ca81e9e9ae7e8b1b95e5c42883938d51aa54297c13c7de636270458640"} Oct 03 08:08:12 crc kubenswrapper[4664]: I1003 08:08:12.327594 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d4e94ca81e9e9ae7e8b1b95e5c42883938d51aa54297c13c7de636270458640" Oct 03 08:08:12 crc kubenswrapper[4664]: I1003 08:08:12.327684 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-wctgk" Oct 03 08:08:12 crc kubenswrapper[4664]: I1003 08:08:12.336466 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bn2c5" event={"ID":"ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a","Type":"ContainerDied","Data":"3999ef3e4a8992c6f8103c33f211030bfbb56162181cd2cd3c877c5e8b116e54"} Oct 03 08:08:12 crc kubenswrapper[4664]: I1003 08:08:12.336507 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3999ef3e4a8992c6f8103c33f211030bfbb56162181cd2cd3c877c5e8b116e54" Oct 03 08:08:12 crc kubenswrapper[4664]: I1003 08:08:12.336546 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bn2c5" Oct 03 08:08:16 crc kubenswrapper[4664]: I1003 08:08:16.371961 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" event={"ID":"93914ba6-0d1c-4808-ab73-9e0496e68f51","Type":"ContainerStarted","Data":"c80ae21670cfdf45390f944d9befb056e9cfb58c56c5fa77b4aecca711c06e02"} Oct 03 08:08:16 crc kubenswrapper[4664]: I1003 08:08:16.372290 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:16 crc kubenswrapper[4664]: I1003 08:08:16.373514 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8mfsn" event={"ID":"9197763c-fb03-4db8-9edb-b35fdc61856f","Type":"ContainerStarted","Data":"eb64e74ea9a2cba531637e440f05cb932c6b4375f5f1c63ec64f3e11fe1fff6d"} Oct 03 08:08:16 crc kubenswrapper[4664]: I1003 08:08:16.416808 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-8mfsn" podStartSLOduration=11.714274356 podStartE2EDuration="18.416786676s" podCreationTimestamp="2025-10-03 08:07:58 +0000 UTC" firstStartedPulling="2025-10-03 08:08:09.171786048 +0000 UTC m=+1189.992976538" lastFinishedPulling="2025-10-03 08:08:15.874298368 +0000 UTC m=+1196.695488858" observedRunningTime="2025-10-03 08:08:16.412918946 +0000 UTC m=+1197.234109466" watchObservedRunningTime="2025-10-03 08:08:16.416786676 +0000 UTC m=+1197.237977176" Oct 03 08:08:16 crc kubenswrapper[4664]: I1003 08:08:16.418958 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" podStartSLOduration=6.418947578 podStartE2EDuration="6.418947578s" podCreationTimestamp="2025-10-03 08:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:08:16.394793459 +0000 UTC m=+1197.215983959" watchObservedRunningTime="2025-10-03 08:08:16.418947578 +0000 UTC m=+1197.240138068" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.145614 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6b46-account-create-fgbmb"] Oct 03 08:08:18 crc kubenswrapper[4664]: E1003 08:08:18.148784 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a629e3c9-2dba-46d6-a978-bc6133d150e1" containerName="mariadb-database-create" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.148908 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="a629e3c9-2dba-46d6-a978-bc6133d150e1" containerName="mariadb-database-create" Oct 03 08:08:18 crc kubenswrapper[4664]: E1003 08:08:18.148979 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a" 
containerName="mariadb-database-create" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.149033 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a" containerName="mariadb-database-create" Oct 03 08:08:18 crc kubenswrapper[4664]: E1003 08:08:18.149095 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13d51df0-aff8-4202-b94c-814faaf05cbb" containerName="mariadb-database-create" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.149155 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d51df0-aff8-4202-b94c-814faaf05cbb" containerName="mariadb-database-create" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.149413 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="13d51df0-aff8-4202-b94c-814faaf05cbb" containerName="mariadb-database-create" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.149495 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a" containerName="mariadb-database-create" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.149564 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="a629e3c9-2dba-46d6-a978-bc6133d150e1" containerName="mariadb-database-create" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.150183 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6b46-account-create-fgbmb" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.155311 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.169221 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6b46-account-create-fgbmb"] Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.249365 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp4gp\" (UniqueName: \"kubernetes.io/projected/9dfe6751-9a34-4e24-9769-6fe97328ef4a-kube-api-access-kp4gp\") pod \"barbican-6b46-account-create-fgbmb\" (UID: \"9dfe6751-9a34-4e24-9769-6fe97328ef4a\") " pod="openstack/barbican-6b46-account-create-fgbmb" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.346245 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ab56-account-create-2c74m"] Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.347574 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ab56-account-create-2c74m" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.349807 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.350675 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp4gp\" (UniqueName: \"kubernetes.io/projected/9dfe6751-9a34-4e24-9769-6fe97328ef4a-kube-api-access-kp4gp\") pod \"barbican-6b46-account-create-fgbmb\" (UID: \"9dfe6751-9a34-4e24-9769-6fe97328ef4a\") " pod="openstack/barbican-6b46-account-create-fgbmb" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.360501 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ab56-account-create-2c74m"] Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.375296 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp4gp\" (UniqueName: \"kubernetes.io/projected/9dfe6751-9a34-4e24-9769-6fe97328ef4a-kube-api-access-kp4gp\") pod \"barbican-6b46-account-create-fgbmb\" (UID: \"9dfe6751-9a34-4e24-9769-6fe97328ef4a\") " pod="openstack/barbican-6b46-account-create-fgbmb" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.452955 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkc5n\" (UniqueName: \"kubernetes.io/projected/caaff67b-4184-4473-a892-305de99f3886-kube-api-access-xkc5n\") pod \"cinder-ab56-account-create-2c74m\" (UID: \"caaff67b-4184-4473-a892-305de99f3886\") " pod="openstack/cinder-ab56-account-create-2c74m" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.471481 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6b46-account-create-fgbmb" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.548224 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-691a-account-create-v52t7"] Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.551733 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-691a-account-create-v52t7" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.555165 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.555469 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkc5n\" (UniqueName: \"kubernetes.io/projected/caaff67b-4184-4473-a892-305de99f3886-kube-api-access-xkc5n\") pod \"cinder-ab56-account-create-2c74m\" (UID: \"caaff67b-4184-4473-a892-305de99f3886\") " pod="openstack/cinder-ab56-account-create-2c74m" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.559354 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-691a-account-create-v52t7"] Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.580184 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkc5n\" (UniqueName: \"kubernetes.io/projected/caaff67b-4184-4473-a892-305de99f3886-kube-api-access-xkc5n\") pod \"cinder-ab56-account-create-2c74m\" (UID: \"caaff67b-4184-4473-a892-305de99f3886\") " pod="openstack/cinder-ab56-account-create-2c74m" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.657273 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grdb4\" (UniqueName: \"kubernetes.io/projected/9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a-kube-api-access-grdb4\") pod \"neutron-691a-account-create-v52t7\" (UID: \"9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a\") " pod="openstack/neutron-691a-account-create-v52t7" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.667535 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ab56-account-create-2c74m" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.759939 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grdb4\" (UniqueName: \"kubernetes.io/projected/9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a-kube-api-access-grdb4\") pod \"neutron-691a-account-create-v52t7\" (UID: \"9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a\") " pod="openstack/neutron-691a-account-create-v52t7" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.818721 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grdb4\" (UniqueName: \"kubernetes.io/projected/9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a-kube-api-access-grdb4\") pod \"neutron-691a-account-create-v52t7\" (UID: \"9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a\") " pod="openstack/neutron-691a-account-create-v52t7" Oct 03 08:08:18 crc kubenswrapper[4664]: I1003 08:08:18.952360 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-691a-account-create-v52t7" Oct 03 08:08:19 crc kubenswrapper[4664]: I1003 08:08:19.264073 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6b46-account-create-fgbmb"] Oct 03 08:08:19 crc kubenswrapper[4664]: W1003 08:08:19.267239 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dfe6751_9a34_4e24_9769_6fe97328ef4a.slice/crio-8ab8cbe67f69cab5bec84b0ece11c34875c18477556a472cc021a7029a5846cb WatchSource:0}: Error finding container 8ab8cbe67f69cab5bec84b0ece11c34875c18477556a472cc021a7029a5846cb: Status 404 returned error can't find the container with id 8ab8cbe67f69cab5bec84b0ece11c34875c18477556a472cc021a7029a5846cb Oct 03 08:08:19 crc kubenswrapper[4664]: I1003 08:08:19.373533 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ab56-account-create-2c74m"] Oct 03 08:08:19 crc kubenswrapper[4664]: W1003 08:08:19.388341 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaaff67b_4184_4473_a892_305de99f3886.slice/crio-1e172864cc0262ce9b62b66f5ca597980aa684ffef104ef50ec719b8962bda18 WatchSource:0}: Error finding container 1e172864cc0262ce9b62b66f5ca597980aa684ffef104ef50ec719b8962bda18: Status 404 returned error can't find the container with id 1e172864cc0262ce9b62b66f5ca597980aa684ffef104ef50ec719b8962bda18 Oct 03 08:08:19 crc kubenswrapper[4664]: I1003 08:08:19.410338 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ab56-account-create-2c74m" event={"ID":"caaff67b-4184-4473-a892-305de99f3886","Type":"ContainerStarted","Data":"1e172864cc0262ce9b62b66f5ca597980aa684ffef104ef50ec719b8962bda18"} Oct 03 08:08:19 crc kubenswrapper[4664]: I1003 08:08:19.411634 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6b46-account-create-fgbmb" event={"ID":"9dfe6751-9a34-4e24-9769-6fe97328ef4a","Type":"ContainerStarted","Data":"8ab8cbe67f69cab5bec84b0ece11c34875c18477556a472cc021a7029a5846cb"} Oct 03 08:08:19 crc kubenswrapper[4664]: I1003 08:08:19.457935 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-691a-account-create-v52t7"] Oct 03 08:08:19 crc kubenswrapper[4664]: W1003 08:08:19.469507 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c1c6c43_9fc4_46f4_b47e_620efb9d7c4a.slice/crio-185ea6f45bc04e966d2a7fba9d6787e0a99a4476456f31a3cf8c30d1852b3b1f WatchSource:0}: Error finding container 185ea6f45bc04e966d2a7fba9d6787e0a99a4476456f31a3cf8c30d1852b3b1f: Status 404 returned error can't find the container with id 185ea6f45bc04e966d2a7fba9d6787e0a99a4476456f31a3cf8c30d1852b3b1f Oct 03 08:08:19 crc kubenswrapper[4664]: W1003 08:08:19.932972 4664 container.go:586] Failed to update stats for container "/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaaff67b_4184_4473_a892_305de99f3886.slice/crio-1e172864cc0262ce9b62b66f5ca597980aa684ffef104ef50ec719b8962bda18": error while statting cgroup v2: [unable to parse /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaaff67b_4184_4473_a892_305de99f3886.slice/crio-1e172864cc0262ce9b62b66f5ca597980aa684ffef104ef50ec719b8962bda18/memory.stat: read 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaaff67b_4184_4473_a892_305de99f3886.slice/crio-1e172864cc0262ce9b62b66f5ca597980aa684ffef104ef50ec719b8962bda18/memory.stat: no such device], continuing to push stats Oct 03 08:08:20 crc kubenswrapper[4664]: I1003 08:08:20.424054 4664 generic.go:334] "Generic (PLEG): container finished" podID="9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a" containerID="90ae26e109a6e5662ae4b4f690e8d4514c32b784b4472b876cffad013773a11f" exitCode=0 Oct 03 08:08:20 crc kubenswrapper[4664]: I1003 08:08:20.424202 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-691a-account-create-v52t7" event={"ID":"9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a","Type":"ContainerDied","Data":"90ae26e109a6e5662ae4b4f690e8d4514c32b784b4472b876cffad013773a11f"} Oct 03 08:08:20 crc kubenswrapper[4664]: I1003 08:08:20.424255 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-691a-account-create-v52t7" event={"ID":"9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a","Type":"ContainerStarted","Data":"185ea6f45bc04e966d2a7fba9d6787e0a99a4476456f31a3cf8c30d1852b3b1f"} Oct 03 08:08:20 crc kubenswrapper[4664]: I1003 08:08:20.430448 4664 generic.go:334] "Generic (PLEG): container finished" podID="caaff67b-4184-4473-a892-305de99f3886" containerID="72fce17de0d48c9396b4eed1952bbe92dec8d0573dbad65dfb7a261bc47ce5c1" exitCode=0 Oct 03 08:08:20 crc kubenswrapper[4664]: I1003 08:08:20.430614 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ab56-account-create-2c74m" event={"ID":"caaff67b-4184-4473-a892-305de99f3886","Type":"ContainerDied","Data":"72fce17de0d48c9396b4eed1952bbe92dec8d0573dbad65dfb7a261bc47ce5c1"} Oct 03 08:08:20 crc kubenswrapper[4664]: I1003 08:08:20.442952 4664 generic.go:334] "Generic (PLEG): container finished" podID="9dfe6751-9a34-4e24-9769-6fe97328ef4a" containerID="fc414354f59d3a8c75fb2e543abed1b2e50124b4fecb4834ef68f4531fca7eea" exitCode=0 Oct 03 08:08:20 crc kubenswrapper[4664]: I1003 08:08:20.443032 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6b46-account-create-fgbmb" event={"ID":"9dfe6751-9a34-4e24-9769-6fe97328ef4a","Type":"ContainerDied","Data":"fc414354f59d3a8c75fb2e543abed1b2e50124b4fecb4834ef68f4531fca7eea"} Oct 03 08:08:21 crc kubenswrapper[4664]: I1003 08:08:21.455188 4664 generic.go:334] "Generic (PLEG): container finished" podID="9197763c-fb03-4db8-9edb-b35fdc61856f" containerID="eb64e74ea9a2cba531637e440f05cb932c6b4375f5f1c63ec64f3e11fe1fff6d" exitCode=0 Oct 03 08:08:21 crc kubenswrapper[4664]: I1003 08:08:21.455238 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8mfsn" event={"ID":"9197763c-fb03-4db8-9edb-b35fdc61856f","Type":"ContainerDied","Data":"eb64e74ea9a2cba531637e440f05cb932c6b4375f5f1c63ec64f3e11fe1fff6d"} Oct 03 08:08:21 crc kubenswrapper[4664]: I1003 08:08:21.844261 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-691a-account-create-v52t7" Oct 03 08:08:21 crc kubenswrapper[4664]: I1003 08:08:21.852420 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6b46-account-create-fgbmb" Oct 03 08:08:21 crc kubenswrapper[4664]: I1003 08:08:21.859628 4664 util.go:48] "No ready sandbox for pod can be found. 
Oct 03 08:08:21 crc kubenswrapper[4664]: I1003 08:08:21.921512 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grdb4\" (UniqueName: \"kubernetes.io/projected/9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a-kube-api-access-grdb4\") pod \"9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a\" (UID: \"9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a\") "
Oct 03 08:08:21 crc kubenswrapper[4664]: I1003 08:08:21.956801 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a-kube-api-access-grdb4" (OuterVolumeSpecName: "kube-api-access-grdb4") pod "9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a" (UID: "9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a"). InnerVolumeSpecName "kube-api-access-grdb4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.023259 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkc5n\" (UniqueName: \"kubernetes.io/projected/caaff67b-4184-4473-a892-305de99f3886-kube-api-access-xkc5n\") pod \"caaff67b-4184-4473-a892-305de99f3886\" (UID: \"caaff67b-4184-4473-a892-305de99f3886\") "
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.023378 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp4gp\" (UniqueName: \"kubernetes.io/projected/9dfe6751-9a34-4e24-9769-6fe97328ef4a-kube-api-access-kp4gp\") pod \"9dfe6751-9a34-4e24-9769-6fe97328ef4a\" (UID: \"9dfe6751-9a34-4e24-9769-6fe97328ef4a\") "
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.024006 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grdb4\" (UniqueName: \"kubernetes.io/projected/9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a-kube-api-access-grdb4\") on node \"crc\" DevicePath \"\""
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.035035 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caaff67b-4184-4473-a892-305de99f3886-kube-api-access-xkc5n" (OuterVolumeSpecName: "kube-api-access-xkc5n") pod "caaff67b-4184-4473-a892-305de99f3886" (UID: "caaff67b-4184-4473-a892-305de99f3886"). InnerVolumeSpecName "kube-api-access-xkc5n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.035554 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dfe6751-9a34-4e24-9769-6fe97328ef4a-kube-api-access-kp4gp" (OuterVolumeSpecName: "kube-api-access-kp4gp") pod "9dfe6751-9a34-4e24-9769-6fe97328ef4a" (UID: "9dfe6751-9a34-4e24-9769-6fe97328ef4a"). InnerVolumeSpecName "kube-api-access-kp4gp". PluginName "kubernetes.io/projected", VolumeGidValue ""
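
Each kube-api-access-* volume being torn down here is the projected service-account volume injected into every pod: a bound token, the cluster CA bundle, and the pod's namespace merged into one tmpfs mount (hence the kubernetes.io/projected/<podUID>-kube-api-access-<suffix> UniqueName; OuterVolumeSpecName is the name in the pod spec, InnerVolumeSpecName the plugin-level name). A sketch of the equivalent volume definition using the k8s.io/api types (requires that module); the 3607-second expiry is the usual default, assumed rather than stated by this log:

```go
package main

import (
	"encoding/json"
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	expiry := int64(3607) // common default bound-token lifetime; an assumption here
	vol := corev1.Volume{
		Name: "kube-api-access-grdb4",
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					// 1. bound service-account token
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
						Path:              "token",
						ExpirationSeconds: &expiry,
					}},
					// 2. cluster CA bundle
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
						Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
					}},
					// 3. pod namespace via the downward API
					{DownwardAPI: &corev1.DownwardAPIProjection{
						Items: []corev1.DownwardAPIVolumeFile{{
							Path:     "namespace",
							FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"},
						}},
					}},
				},
			},
		},
	}
	out, _ := json.MarshalIndent(vol, "", "  ")
	fmt.Println(string(out))
}
```
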
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.125746 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkc5n\" (UniqueName: \"kubernetes.io/projected/caaff67b-4184-4473-a892-305de99f3886-kube-api-access-xkc5n\") on node \"crc\" DevicePath \"\""
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.125782 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp4gp\" (UniqueName: \"kubernetes.io/projected/9dfe6751-9a34-4e24-9769-6fe97328ef4a-kube-api-access-kp4gp\") on node \"crc\" DevicePath \"\""
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.468622 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-691a-account-create-v52t7" event={"ID":"9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a","Type":"ContainerDied","Data":"185ea6f45bc04e966d2a7fba9d6787e0a99a4476456f31a3cf8c30d1852b3b1f"}
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.469155 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="185ea6f45bc04e966d2a7fba9d6787e0a99a4476456f31a3cf8c30d1852b3b1f"
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.468710 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-691a-account-create-v52t7"
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.472405 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ab56-account-create-2c74m" event={"ID":"caaff67b-4184-4473-a892-305de99f3886","Type":"ContainerDied","Data":"1e172864cc0262ce9b62b66f5ca597980aa684ffef104ef50ec719b8962bda18"}
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.472441 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e172864cc0262ce9b62b66f5ca597980aa684ffef104ef50ec719b8962bda18"
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.472507 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ab56-account-create-2c74m"
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.487625 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6b46-account-create-fgbmb" event={"ID":"9dfe6751-9a34-4e24-9769-6fe97328ef4a","Type":"ContainerDied","Data":"8ab8cbe67f69cab5bec84b0ece11c34875c18477556a472cc021a7029a5846cb"}
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.487668 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ab8cbe67f69cab5bec84b0ece11c34875c18477556a472cc021a7029a5846cb"
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.487724 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6b46-account-create-fgbmb"
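
Note the teardown order for the three completed account-create jobs: the app container exits (exitCode=0), volumes unmount, and only then does the sandbox die, so the final ContainerDied for each pod carries the sandbox ID from its original ContainerStarted (185ea6f4... for neutron, 1e172864... for cinder, 8ab8cbe6... for barbican), and "Container not found in pod's containers" is expected rather than an error. A rough way to confirm the pairing offline, assuming only the event={...} JSON shape visible in these lines (feed it journalctl output on stdin):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Extract Type and Data from the PLEG event={...} payloads and report every
// 64-hex-char ID that was both started and died, pairing each ContainerDied
// back to its earlier ContainerStarted (sandbox IDs show up in both).
var eventRe = regexp.MustCompile(`"Type":"(ContainerStarted|ContainerDied)","Data":"([0-9a-f]{64})"`)

func main() {
	started := map[string]bool{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1<<20), 1<<20) // journal lines can be long
	for sc.Scan() {
		for _, m := range eventRe.FindAllStringSubmatch(sc.Text(), -1) {
			typ, id := m[1], m[2]
			if typ == "ContainerStarted" {
				started[id] = true
			} else if started[id] {
				fmt.Printf("started and died: %s...\n", id[:12])
			}
		}
	}
}
```
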
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.498432 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-djcmk" event={"ID":"d75d0f35-4cda-4925-8d0d-1666f794ce9b","Type":"ContainerStarted","Data":"c62a70250bacca48817b9ee175f91b0a12a26ee8d51bc5f0a3fe50be3677205e"}
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.529258 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-djcmk" podStartSLOduration=2.680822663 podStartE2EDuration="36.529236218s" podCreationTimestamp="2025-10-03 08:07:46 +0000 UTC" firstStartedPulling="2025-10-03 08:07:47.652295359 +0000 UTC m=+1168.473485839" lastFinishedPulling="2025-10-03 08:08:21.500708904 +0000 UTC m=+1202.321899394" observedRunningTime="2025-10-03 08:08:22.527478688 +0000 UTC m=+1203.348669188" watchObservedRunningTime="2025-10-03 08:08:22.529236218 +0000 UTC m=+1203.350426708"
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.822493 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8mfsn"
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.938208 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qsss\" (UniqueName: \"kubernetes.io/projected/9197763c-fb03-4db8-9edb-b35fdc61856f-kube-api-access-6qsss\") pod \"9197763c-fb03-4db8-9edb-b35fdc61856f\" (UID: \"9197763c-fb03-4db8-9edb-b35fdc61856f\") "
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.938451 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9197763c-fb03-4db8-9edb-b35fdc61856f-combined-ca-bundle\") pod \"9197763c-fb03-4db8-9edb-b35fdc61856f\" (UID: \"9197763c-fb03-4db8-9edb-b35fdc61856f\") "
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.938505 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9197763c-fb03-4db8-9edb-b35fdc61856f-config-data\") pod \"9197763c-fb03-4db8-9edb-b35fdc61856f\" (UID: \"9197763c-fb03-4db8-9edb-b35fdc61856f\") "
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.944648 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9197763c-fb03-4db8-9edb-b35fdc61856f-kube-api-access-6qsss" (OuterVolumeSpecName: "kube-api-access-6qsss") pod "9197763c-fb03-4db8-9edb-b35fdc61856f" (UID: "9197763c-fb03-4db8-9edb-b35fdc61856f"). InnerVolumeSpecName "kube-api-access-6qsss". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.968860 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9197763c-fb03-4db8-9edb-b35fdc61856f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9197763c-fb03-4db8-9edb-b35fdc61856f" (UID: "9197763c-fb03-4db8-9edb-b35fdc61856f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:08:22 crc kubenswrapper[4664]: I1003 08:08:22.991178 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9197763c-fb03-4db8-9edb-b35fdc61856f-config-data" (OuterVolumeSpecName: "config-data") pod "9197763c-fb03-4db8-9edb-b35fdc61856f" (UID: "9197763c-fb03-4db8-9edb-b35fdc61856f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
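
The glance-db-sync-djcmk startup entry above decomposes cleanly: almost all of the 36.5s podStartE2EDuration is image pulling, and podStartSLOduration is the e2e figure minus the pull window, which is why a 36-second start still scores ~2.68s against the startup SLO. Recomputing from the monotonic m=+... offsets in that entry:

```go
package main

import "fmt"

// Recompute the pod_startup_latency_tracker numbers for glance-db-sync-djcmk
// from the monotonic offsets in the log entry above.
func main() {
	const (
		firstStartedPulling = 1168.473485839 // m=+... seconds
		lastFinishedPulling = 1202.321899394
		e2e                 = 36.529236218 // watchObservedRunningTime - podCreationTimestamp
	)
	pull := lastFinishedPulling - firstStartedPulling
	fmt.Printf("image pull window:   %.9fs\n", pull)      // ≈ 33.848413555s
	fmt.Printf("podStartSLOduration: %.9fs\n", e2e-pull)  // ≈ 2.680822663s, matching the log
}
```
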
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.040658 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9197763c-fb03-4db8-9edb-b35fdc61856f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.040692 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9197763c-fb03-4db8-9edb-b35fdc61856f-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.040701 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qsss\" (UniqueName: \"kubernetes.io/projected/9197763c-fb03-4db8-9edb-b35fdc61856f-kube-api-access-6qsss\") on node \"crc\" DevicePath \"\""
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.508314 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8mfsn" event={"ID":"9197763c-fb03-4db8-9edb-b35fdc61856f","Type":"ContainerDied","Data":"b0388cfd51d8cfdb126d3bfe8ab96be69bebe99ce3d2318cb85423400d42d690"}
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.508361 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0388cfd51d8cfdb126d3bfe8ab96be69bebe99ce3d2318cb85423400d42d690"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.508376 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8mfsn"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.705187 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-55xrn"]
Oct 03 08:08:23 crc kubenswrapper[4664]: E1003 08:08:23.705772 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9197763c-fb03-4db8-9edb-b35fdc61856f" containerName="keystone-db-sync"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.705850 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9197763c-fb03-4db8-9edb-b35fdc61856f" containerName="keystone-db-sync"
Oct 03 08:08:23 crc kubenswrapper[4664]: E1003 08:08:23.705908 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dfe6751-9a34-4e24-9769-6fe97328ef4a" containerName="mariadb-account-create"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.712010 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dfe6751-9a34-4e24-9769-6fe97328ef4a" containerName="mariadb-account-create"
Oct 03 08:08:23 crc kubenswrapper[4664]: E1003 08:08:23.712251 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caaff67b-4184-4473-a892-305de99f3886" containerName="mariadb-account-create"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.712318 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="caaff67b-4184-4473-a892-305de99f3886" containerName="mariadb-account-create"
Oct 03 08:08:23 crc kubenswrapper[4664]: E1003 08:08:23.712378 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a" containerName="mariadb-account-create"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.712421 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a" containerName="mariadb-account-create"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.712838 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="caaff67b-4184-4473-a892-305de99f3886" containerName="mariadb-account-create"
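
The RemoveStaleState burst is the CPU and memory managers pruning checkpointed assignments for pods that no longer exist, triggered here by the keystone-bootstrap-55xrn admission; the state is persisted in checkpoint files under /var/lib/kubelet. A sketch of reading the CPU-manager checkpoint; the field names follow the file's JSON layout as of recent kubelets and should be treated as illustrative, not a stable API:

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// Approximate shape of /var/lib/kubelet/cpu_manager_state (version- and
// policy-dependent; "entries" only appears with the static policy).
type cpuManagerState struct {
	PolicyName    string                       `json:"policyName"`
	DefaultCPUSet string                       `json:"defaultCpuSet"`
	Entries       map[string]map[string]string `json:"entries,omitempty"` // podUID -> container -> cpuset
	Checksum      uint64                       `json:"checksum"`
}

func main() {
	raw, err := os.ReadFile("/var/lib/kubelet/cpu_manager_state")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	var st cpuManagerState
	if err := json.Unmarshal(raw, &st); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("policy=%s defaultCpuSet=%q pinned pods=%d\n",
		st.PolicyName, st.DefaultCPUSet, len(st.Entries))
}
```
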
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.712948 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a" containerName="mariadb-account-create"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.713046 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dfe6751-9a34-4e24-9769-6fe97328ef4a" containerName="mariadb-account-create"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.713146 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="9197763c-fb03-4db8-9edb-b35fdc61856f" containerName="keystone-db-sync"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.713940 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-55xrn"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.716319 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.716706 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.716954 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.717297 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bm9nt"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.731796 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-55xrn"]
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.765752 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-t5mkq"]
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.766027 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" podUID="93914ba6-0d1c-4808-ab73-9e0496e68f51" containerName="dnsmasq-dns" containerID="cri-o://c80ae21670cfdf45390f944d9befb056e9cfb58c56c5fa77b4aecca711c06e02" gracePeriod=10
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.782422 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq"
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.804299 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-hgzvv"]
Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.813692 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.858956 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-combined-ca-bundle\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.859017 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-fernet-keys\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.859226 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-credential-keys\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.859324 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ljrh\" (UniqueName: \"kubernetes.io/projected/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-kube-api-access-7ljrh\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.859383 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-scripts\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.859534 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-config-data\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.950415 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-hgzvv"] Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.961507 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ljrh\" (UniqueName: \"kubernetes.io/projected/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-kube-api-access-7ljrh\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.961583 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-dns-svc\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.961636 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-config\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.961669 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-scripts\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.961719 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-config-data\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.961758 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.961799 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-combined-ca-bundle\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.961830 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-fernet-keys\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.961904 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.961934 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.961969 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf2lm\" (UniqueName: \"kubernetes.io/projected/8c5234c9-faac-4793-9514-d444ddca8a0d-kube-api-access-lf2lm\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.961995 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-credential-keys\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.975267 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-scripts\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:23 crc kubenswrapper[4664]: I1003 08:08:23.989222 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-credential-keys\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.002915 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-lgr4h"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.004376 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.007247 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-combined-ca-bundle\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.008079 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-fernet-keys\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.010213 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-config-data\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.011969 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.012187 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fbgj8" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.012297 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.017855 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ljrh\" (UniqueName: \"kubernetes.io/projected/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-kube-api-access-7ljrh\") pod \"keystone-bootstrap-55xrn\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.024954 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lgr4h"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.050925 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.064066 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhjb2\" (UniqueName: \"kubernetes.io/projected/4d316d5e-f411-4940-af4d-9c42f5baae63-kube-api-access-qhjb2\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.064117 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.064146 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.064165 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-combined-ca-bundle\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.064191 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf2lm\" (UniqueName: \"kubernetes.io/projected/8c5234c9-faac-4793-9514-d444ddca8a0d-kube-api-access-lf2lm\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.064222 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-scripts\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.064253 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-db-sync-config-data\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.064278 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-dns-svc\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.064295 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-config-data\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " 
pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.064315 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-config\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.064359 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.064409 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d316d5e-f411-4940-af4d-9c42f5baae63-etc-machine-id\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.065370 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.066467 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-dns-svc\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.066553 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.067116 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-config\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.067206 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.088418 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vb5f6"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.089682 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vb5f6" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.093397 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-km9tz" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.093597 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.100934 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.142413 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf2lm\" (UniqueName: \"kubernetes.io/projected/8c5234c9-faac-4793-9514-d444ddca8a0d-kube-api-access-lf2lm\") pod \"dnsmasq-dns-6f8c45789f-hgzvv\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.162386 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.167002 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-config\") pod \"neutron-db-sync-vb5f6\" (UID: \"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc\") " pod="openstack/neutron-db-sync-vb5f6" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.167050 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d316d5e-f411-4940-af4d-9c42f5baae63-etc-machine-id\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.167100 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhjb2\" (UniqueName: \"kubernetes.io/projected/4d316d5e-f411-4940-af4d-9c42f5baae63-kube-api-access-qhjb2\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.167140 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-combined-ca-bundle\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.167173 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-scripts\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.167210 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-db-sync-config-data\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.167234 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-config-data\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.167257 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-combined-ca-bundle\") pod \"neutron-db-sync-vb5f6\" (UID: \"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc\") " pod="openstack/neutron-db-sync-vb5f6" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.167282 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bndr2\" (UniqueName: \"kubernetes.io/projected/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-kube-api-access-bndr2\") pod \"neutron-db-sync-vb5f6\" (UID: \"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc\") " pod="openstack/neutron-db-sync-vb5f6" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.167400 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d316d5e-f411-4940-af4d-9c42f5baae63-etc-machine-id\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.172786 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6978d9ddc7-5hs4k"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.174722 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.179445 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-config-data\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.179828 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-tp9js" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.180115 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.180246 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.180677 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.186208 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-scripts\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.190003 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-db-sync-config-data\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.190506 4664 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-combined-ca-bundle\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.233935 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vb5f6"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.243403 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhjb2\" (UniqueName: \"kubernetes.io/projected/4d316d5e-f411-4940-af4d-9c42f5baae63-kube-api-access-qhjb2\") pod \"cinder-db-sync-lgr4h\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.253937 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6978d9ddc7-5hs4k"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.268690 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-logs\") pod \"horizon-6978d9ddc7-5hs4k\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.268786 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-config\") pod \"neutron-db-sync-vb5f6\" (UID: \"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc\") " pod="openstack/neutron-db-sync-vb5f6" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.268821 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-scripts\") pod \"horizon-6978d9ddc7-5hs4k\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.268874 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-horizon-secret-key\") pod \"horizon-6978d9ddc7-5hs4k\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.268932 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-config-data\") pod \"horizon-6978d9ddc7-5hs4k\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.269074 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfkn4\" (UniqueName: \"kubernetes.io/projected/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-kube-api-access-rfkn4\") pod \"horizon-6978d9ddc7-5hs4k\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.269125 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-combined-ca-bundle\") pod \"neutron-db-sync-vb5f6\" (UID: \"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc\") " pod="openstack/neutron-db-sync-vb5f6" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.269159 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bndr2\" (UniqueName: \"kubernetes.io/projected/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-kube-api-access-bndr2\") pod \"neutron-db-sync-vb5f6\" (UID: \"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc\") " pod="openstack/neutron-db-sync-vb5f6" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.277575 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-config\") pod \"neutron-db-sync-vb5f6\" (UID: \"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc\") " pod="openstack/neutron-db-sync-vb5f6" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.283358 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-combined-ca-bundle\") pod \"neutron-db-sync-vb5f6\" (UID: \"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc\") " pod="openstack/neutron-db-sync-vb5f6" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.312595 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.314449 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bndr2\" (UniqueName: \"kubernetes.io/projected/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-kube-api-access-bndr2\") pod \"neutron-db-sync-vb5f6\" (UID: \"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc\") " pod="openstack/neutron-db-sync-vb5f6" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.317129 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.322939 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.323290 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.368592 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.371668 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.371725 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nts6v\" (UniqueName: \"kubernetes.io/projected/7050eaa4-061a-4dd3-b4da-73e2abd04458-kube-api-access-nts6v\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.371766 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-config-data\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.371814 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfkn4\" (UniqueName: \"kubernetes.io/projected/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-kube-api-access-rfkn4\") pod \"horizon-6978d9ddc7-5hs4k\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.371833 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-scripts\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.372011 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-logs\") pod \"horizon-6978d9ddc7-5hs4k\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.372118 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7050eaa4-061a-4dd3-b4da-73e2abd04458-log-httpd\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.372290 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-scripts\") pod \"horizon-6978d9ddc7-5hs4k\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc 
kubenswrapper[4664]: I1003 08:08:24.372321 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7050eaa4-061a-4dd3-b4da-73e2abd04458-run-httpd\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.372415 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-horizon-secret-key\") pod \"horizon-6978d9ddc7-5hs4k\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.372485 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.372877 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-config-data\") pod \"horizon-6978d9ddc7-5hs4k\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.374823 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-config-data\") pod \"horizon-6978d9ddc7-5hs4k\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.375775 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-logs\") pod \"horizon-6978d9ddc7-5hs4k\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.376403 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-scripts\") pod \"horizon-6978d9ddc7-5hs4k\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.388677 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wcvcw"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.390128 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wcvcw" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.391200 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wcvcw"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.400433 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.400811 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8xbqs" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.400875 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-horizon-secret-key\") pod \"horizon-6978d9ddc7-5hs4k\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.402456 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-796f89ff4f-qpp7l"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.421962 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.439352 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-796f89ff4f-qpp7l"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.441099 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfkn4\" (UniqueName: \"kubernetes.io/projected/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-kube-api-access-rfkn4\") pod \"horizon-6978d9ddc7-5hs4k\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.480966 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7050eaa4-061a-4dd3-b4da-73e2abd04458-log-httpd\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.481044 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7050eaa4-061a-4dd3-b4da-73e2abd04458-run-httpd\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.481086 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82832c17-408b-4b89-992f-09e393024fe2-combined-ca-bundle\") pod \"barbican-db-sync-wcvcw\" (UID: \"82832c17-408b-4b89-992f-09e393024fe2\") " pod="openstack/barbican-db-sync-wcvcw" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.481120 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43c23d38-4d5e-4b2e-a062-8384dd8d8138-config-data\") pod \"horizon-796f89ff4f-qpp7l\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.481149 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k89f8\" (UniqueName: 
\"kubernetes.io/projected/43c23d38-4d5e-4b2e-a062-8384dd8d8138-kube-api-access-k89f8\") pod \"horizon-796f89ff4f-qpp7l\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.481176 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.481195 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4q5z\" (UniqueName: \"kubernetes.io/projected/82832c17-408b-4b89-992f-09e393024fe2-kube-api-access-q4q5z\") pod \"barbican-db-sync-wcvcw\" (UID: \"82832c17-408b-4b89-992f-09e393024fe2\") " pod="openstack/barbican-db-sync-wcvcw" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.481227 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43c23d38-4d5e-4b2e-a062-8384dd8d8138-scripts\") pod \"horizon-796f89ff4f-qpp7l\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.481429 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.481460 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nts6v\" (UniqueName: \"kubernetes.io/projected/7050eaa4-061a-4dd3-b4da-73e2abd04458-kube-api-access-nts6v\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.481503 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43c23d38-4d5e-4b2e-a062-8384dd8d8138-horizon-secret-key\") pod \"horizon-796f89ff4f-qpp7l\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.481542 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-config-data\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.481574 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43c23d38-4d5e-4b2e-a062-8384dd8d8138-logs\") pod \"horizon-796f89ff4f-qpp7l\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.481619 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-scripts\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " 
pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.481682 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/82832c17-408b-4b89-992f-09e393024fe2-db-sync-config-data\") pod \"barbican-db-sync-wcvcw\" (UID: \"82832c17-408b-4b89-992f-09e393024fe2\") " pod="openstack/barbican-db-sync-wcvcw" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.482252 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7050eaa4-061a-4dd3-b4da-73e2abd04458-log-httpd\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.482483 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7050eaa4-061a-4dd3-b4da-73e2abd04458-run-httpd\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.492067 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-config-data\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.504761 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.513907 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.516893 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-scripts\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.522040 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.526827 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-hgzvv"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.544782 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-666mt"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.546323 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.552597 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nts6v\" (UniqueName: \"kubernetes.io/projected/7050eaa4-061a-4dd3-b4da-73e2abd04458-kube-api-access-nts6v\") pod \"ceilometer-0\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.564394 4664 generic.go:334] "Generic (PLEG): container finished" podID="93914ba6-0d1c-4808-ab73-9e0496e68f51" containerID="c80ae21670cfdf45390f944d9befb056e9cfb58c56c5fa77b4aecca711c06e02" exitCode=0 Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.564443 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" event={"ID":"93914ba6-0d1c-4808-ab73-9e0496e68f51","Type":"ContainerDied","Data":"c80ae21670cfdf45390f944d9befb056e9cfb58c56c5fa77b4aecca711c06e02"} Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.573866 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vb5f6" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.586259 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-k59vh"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.587901 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.588062 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.588114 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43c23d38-4d5e-4b2e-a062-8384dd8d8138-logs\") pod \"horizon-796f89ff4f-qpp7l\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.588164 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.588223 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.588261 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/82832c17-408b-4b89-992f-09e393024fe2-db-sync-config-data\") pod \"barbican-db-sync-wcvcw\" (UID: \"82832c17-408b-4b89-992f-09e393024fe2\") " pod="openstack/barbican-db-sync-wcvcw" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.588346 4664 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckgbt\" (UniqueName: \"kubernetes.io/projected/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-kube-api-access-ckgbt\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.588395 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82832c17-408b-4b89-992f-09e393024fe2-combined-ca-bundle\") pod \"barbican-db-sync-wcvcw\" (UID: \"82832c17-408b-4b89-992f-09e393024fe2\") " pod="openstack/barbican-db-sync-wcvcw" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.588435 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43c23d38-4d5e-4b2e-a062-8384dd8d8138-config-data\") pod \"horizon-796f89ff4f-qpp7l\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.588472 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k89f8\" (UniqueName: \"kubernetes.io/projected/43c23d38-4d5e-4b2e-a062-8384dd8d8138-kube-api-access-k89f8\") pod \"horizon-796f89ff4f-qpp7l\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.588506 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4q5z\" (UniqueName: \"kubernetes.io/projected/82832c17-408b-4b89-992f-09e393024fe2-kube-api-access-q4q5z\") pod \"barbican-db-sync-wcvcw\" (UID: \"82832c17-408b-4b89-992f-09e393024fe2\") " pod="openstack/barbican-db-sync-wcvcw" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.588551 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43c23d38-4d5e-4b2e-a062-8384dd8d8138-scripts\") pod \"horizon-796f89ff4f-qpp7l\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.588576 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.588681 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-config\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.588758 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43c23d38-4d5e-4b2e-a062-8384dd8d8138-horizon-secret-key\") pod \"horizon-796f89ff4f-qpp7l\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.604464 4664 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"placement-placement-dockercfg-8qjww" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.604949 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.605220 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.610339 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.617982 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43c23d38-4d5e-4b2e-a062-8384dd8d8138-horizon-secret-key\") pod \"horizon-796f89ff4f-qpp7l\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.618669 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43c23d38-4d5e-4b2e-a062-8384dd8d8138-scripts\") pod \"horizon-796f89ff4f-qpp7l\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.623207 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43c23d38-4d5e-4b2e-a062-8384dd8d8138-logs\") pod \"horizon-796f89ff4f-qpp7l\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.623277 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-k59vh"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.626102 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82832c17-408b-4b89-992f-09e393024fe2-combined-ca-bundle\") pod \"barbican-db-sync-wcvcw\" (UID: \"82832c17-408b-4b89-992f-09e393024fe2\") " pod="openstack/barbican-db-sync-wcvcw" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.627655 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43c23d38-4d5e-4b2e-a062-8384dd8d8138-config-data\") pod \"horizon-796f89ff4f-qpp7l\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.650334 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/82832c17-408b-4b89-992f-09e393024fe2-db-sync-config-data\") pod \"barbican-db-sync-wcvcw\" (UID: \"82832c17-408b-4b89-992f-09e393024fe2\") " pod="openstack/barbican-db-sync-wcvcw" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.669736 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.670347 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4q5z\" (UniqueName: \"kubernetes.io/projected/82832c17-408b-4b89-992f-09e393024fe2-kube-api-access-q4q5z\") pod \"barbican-db-sync-wcvcw\" (UID: \"82832c17-408b-4b89-992f-09e393024fe2\") " pod="openstack/barbican-db-sync-wcvcw" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.675697 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k89f8\" (UniqueName: \"kubernetes.io/projected/43c23d38-4d5e-4b2e-a062-8384dd8d8138-kube-api-access-k89f8\") pod \"horizon-796f89ff4f-qpp7l\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.683430 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-666mt"] Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.690829 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-scripts\") pod \"placement-db-sync-k59vh\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.690881 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckgbt\" (UniqueName: \"kubernetes.io/projected/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-kube-api-access-ckgbt\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.690922 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr44v\" (UniqueName: \"kubernetes.io/projected/c13de59d-0879-41ca-95cc-e8bf05c223eb-kube-api-access-gr44v\") pod \"placement-db-sync-k59vh\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.690956 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-config-data\") pod \"placement-db-sync-k59vh\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.691012 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.691042 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-config\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.691087 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.691105 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c13de59d-0879-41ca-95cc-e8bf05c223eb-logs\") pod \"placement-db-sync-k59vh\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.691132 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.691150 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-combined-ca-bundle\") pod \"placement-db-sync-k59vh\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.691173 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.692197 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.693137 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.694447 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.695414 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.708615 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-config\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: 
\"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.759766 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckgbt\" (UniqueName: \"kubernetes.io/projected/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-kube-api-access-ckgbt\") pod \"dnsmasq-dns-fcfdd6f9f-666mt\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.760224 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wcvcw" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.792924 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c13de59d-0879-41ca-95cc-e8bf05c223eb-logs\") pod \"placement-db-sync-k59vh\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.792995 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-combined-ca-bundle\") pod \"placement-db-sync-k59vh\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.793061 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-scripts\") pod \"placement-db-sync-k59vh\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.793088 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr44v\" (UniqueName: \"kubernetes.io/projected/c13de59d-0879-41ca-95cc-e8bf05c223eb-kube-api-access-gr44v\") pod \"placement-db-sync-k59vh\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.793115 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-config-data\") pod \"placement-db-sync-k59vh\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.806288 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c13de59d-0879-41ca-95cc-e8bf05c223eb-logs\") pod \"placement-db-sync-k59vh\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.812721 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-config-data\") pod \"placement-db-sync-k59vh\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.820374 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-scripts\") pod \"placement-db-sync-k59vh\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " 
pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.820993 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-combined-ca-bundle\") pod \"placement-db-sync-k59vh\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:24 crc kubenswrapper[4664]: I1003 08:08:24.890384 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr44v\" (UniqueName: \"kubernetes.io/projected/c13de59d-0879-41ca-95cc-e8bf05c223eb-kube-api-access-gr44v\") pod \"placement-db-sync-k59vh\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:25 crc kubenswrapper[4664]: I1003 08:08:25.114526 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:08:25 crc kubenswrapper[4664]: I1003 08:08:25.168028 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:25 crc kubenswrapper[4664]: I1003 08:08:25.347664 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-k59vh" Oct 03 08:08:25 crc kubenswrapper[4664]: I1003 08:08:25.638734 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-hgzvv"] Oct 03 08:08:25 crc kubenswrapper[4664]: I1003 08:08:25.781598 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-55xrn"] Oct 03 08:08:26 crc kubenswrapper[4664]: I1003 08:08:26.201896 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6978d9ddc7-5hs4k"] Oct 03 08:08:26 crc kubenswrapper[4664]: I1003 08:08:26.214513 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wcvcw"] Oct 03 08:08:26 crc kubenswrapper[4664]: I1003 08:08:26.224729 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:08:26 crc kubenswrapper[4664]: I1003 08:08:26.233646 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vb5f6"] Oct 03 08:08:26 crc kubenswrapper[4664]: I1003 08:08:26.244616 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lgr4h"] Oct 03 08:08:26 crc kubenswrapper[4664]: I1003 08:08:26.253449 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-796f89ff4f-qpp7l"] Oct 03 08:08:26 crc kubenswrapper[4664]: I1003 08:08:26.262107 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-666mt"] Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.643299 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.653448 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6978d9ddc7-5hs4k"] Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.724100 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6c84f6cb8c-j8pcf"] Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.725587 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.757749 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23020330-9f07-4330-9861-ba58dd62e6b2-scripts\") pod \"horizon-6c84f6cb8c-j8pcf\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.757848 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/23020330-9f07-4330-9861-ba58dd62e6b2-horizon-secret-key\") pod \"horizon-6c84f6cb8c-j8pcf\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.757938 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23020330-9f07-4330-9861-ba58dd62e6b2-config-data\") pod \"horizon-6c84f6cb8c-j8pcf\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.758025 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r7gb\" (UniqueName: \"kubernetes.io/projected/23020330-9f07-4330-9861-ba58dd62e6b2-kube-api-access-6r7gb\") pod \"horizon-6c84f6cb8c-j8pcf\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.758313 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23020330-9f07-4330-9861-ba58dd62e6b2-logs\") pod \"horizon-6c84f6cb8c-j8pcf\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.772779 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c84f6cb8c-j8pcf"] Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.862386 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23020330-9f07-4330-9861-ba58dd62e6b2-logs\") pod \"horizon-6c84f6cb8c-j8pcf\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.862462 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23020330-9f07-4330-9861-ba58dd62e6b2-scripts\") pod \"horizon-6c84f6cb8c-j8pcf\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.862504 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/23020330-9f07-4330-9861-ba58dd62e6b2-horizon-secret-key\") pod \"horizon-6c84f6cb8c-j8pcf\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.862543 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/23020330-9f07-4330-9861-ba58dd62e6b2-config-data\") pod \"horizon-6c84f6cb8c-j8pcf\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.862596 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r7gb\" (UniqueName: \"kubernetes.io/projected/23020330-9f07-4330-9861-ba58dd62e6b2-kube-api-access-6r7gb\") pod \"horizon-6c84f6cb8c-j8pcf\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.863375 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23020330-9f07-4330-9861-ba58dd62e6b2-logs\") pod \"horizon-6c84f6cb8c-j8pcf\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.863967 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23020330-9f07-4330-9861-ba58dd62e6b2-scripts\") pod \"horizon-6c84f6cb8c-j8pcf\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.866393 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23020330-9f07-4330-9861-ba58dd62e6b2-config-data\") pod \"horizon-6c84f6cb8c-j8pcf\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.905091 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/23020330-9f07-4330-9861-ba58dd62e6b2-horizon-secret-key\") pod \"horizon-6c84f6cb8c-j8pcf\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:27 crc kubenswrapper[4664]: I1003 08:08:27.909478 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r7gb\" (UniqueName: \"kubernetes.io/projected/23020330-9f07-4330-9861-ba58dd62e6b2-kube-api-access-6r7gb\") pod \"horizon-6c84f6cb8c-j8pcf\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.045731 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:08:28 crc kubenswrapper[4664]: W1003 08:08:28.157063 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c5234c9_faac_4793_9514_d444ddca8a0d.slice/crio-0771f4273262085683bcda42ec05f1a084327d5be4d9429a2270531162750e51 WatchSource:0}: Error finding container 0771f4273262085683bcda42ec05f1a084327d5be4d9429a2270531162750e51: Status 404 returned error can't find the container with id 0771f4273262085683bcda42ec05f1a084327d5be4d9429a2270531162750e51 Oct 03 08:08:28 crc kubenswrapper[4664]: W1003 08:08:28.175366 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82832c17_408b_4b89_992f_09e393024fe2.slice/crio-6f27a5822343bfedb48b948d2add12e9b4d284c7e6f64fd32867fbdd95f0d54e WatchSource:0}: Error finding container 6f27a5822343bfedb48b948d2add12e9b4d284c7e6f64fd32867fbdd95f0d54e: Status 404 returned error can't find the container with id 6f27a5822343bfedb48b948d2add12e9b4d284c7e6f64fd32867fbdd95f0d54e Oct 03 08:08:28 crc kubenswrapper[4664]: W1003 08:08:28.207234 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d8fc6c1_462f_4dd9_af1b_9a53f665e0cd.slice/crio-8a86d3f624f5fa81c852cc50e11736566da68a1f4c7c14b914ee4bfb32a649de WatchSource:0}: Error finding container 8a86d3f624f5fa81c852cc50e11736566da68a1f4c7c14b914ee4bfb32a649de: Status 404 returned error can't find the container with id 8a86d3f624f5fa81c852cc50e11736566da68a1f4c7c14b914ee4bfb32a649de Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.344704 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.370225 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-dns-swift-storage-0\") pod \"93914ba6-0d1c-4808-ab73-9e0496e68f51\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.370294 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tkzq\" (UniqueName: \"kubernetes.io/projected/93914ba6-0d1c-4808-ab73-9e0496e68f51-kube-api-access-7tkzq\") pod \"93914ba6-0d1c-4808-ab73-9e0496e68f51\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.370342 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-ovsdbserver-nb\") pod \"93914ba6-0d1c-4808-ab73-9e0496e68f51\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.370375 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-ovsdbserver-sb\") pod \"93914ba6-0d1c-4808-ab73-9e0496e68f51\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.370397 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-dns-svc\") pod \"93914ba6-0d1c-4808-ab73-9e0496e68f51\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.370413 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-config\") pod \"93914ba6-0d1c-4808-ab73-9e0496e68f51\" (UID: \"93914ba6-0d1c-4808-ab73-9e0496e68f51\") " Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.405408 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93914ba6-0d1c-4808-ab73-9e0496e68f51-kube-api-access-7tkzq" (OuterVolumeSpecName: "kube-api-access-7tkzq") pod "93914ba6-0d1c-4808-ab73-9e0496e68f51" (UID: "93914ba6-0d1c-4808-ab73-9e0496e68f51"). InnerVolumeSpecName "kube-api-access-7tkzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.474575 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tkzq\" (UniqueName: \"kubernetes.io/projected/93914ba6-0d1c-4808-ab73-9e0496e68f51-kube-api-access-7tkzq\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.477551 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-config" (OuterVolumeSpecName: "config") pod "93914ba6-0d1c-4808-ab73-9e0496e68f51" (UID: "93914ba6-0d1c-4808-ab73-9e0496e68f51"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.513265 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "93914ba6-0d1c-4808-ab73-9e0496e68f51" (UID: "93914ba6-0d1c-4808-ab73-9e0496e68f51"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.515559 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "93914ba6-0d1c-4808-ab73-9e0496e68f51" (UID: "93914ba6-0d1c-4808-ab73-9e0496e68f51"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.530413 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "93914ba6-0d1c-4808-ab73-9e0496e68f51" (UID: "93914ba6-0d1c-4808-ab73-9e0496e68f51"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.556809 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93914ba6-0d1c-4808-ab73-9e0496e68f51" (UID: "93914ba6-0d1c-4808-ab73-9e0496e68f51"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.576763 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.576816 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.576828 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.576837 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.576846 4664 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93914ba6-0d1c-4808-ab73-9e0496e68f51-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.628636 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" event={"ID":"93914ba6-0d1c-4808-ab73-9e0496e68f51","Type":"ContainerDied","Data":"321deae3cd8e887531be923d82d52b21f1e4df5e33aafcaba3bad624ff39ea2f"} Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.628690 4664 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.628717 4664 scope.go:117] "RemoveContainer" containerID="c80ae21670cfdf45390f944d9befb056e9cfb58c56c5fa77b4aecca711c06e02" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.639169 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vb5f6" event={"ID":"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc","Type":"ContainerStarted","Data":"97a116047312948a9ca99f68ced0c3fab929cf794a876804e83cee49c3c5a67e"} Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.642218 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" event={"ID":"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd","Type":"ContainerStarted","Data":"8a86d3f624f5fa81c852cc50e11736566da68a1f4c7c14b914ee4bfb32a649de"} Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.646029 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-55xrn" event={"ID":"576a3dfe-80a8-4676-b79e-1bac2c67b8fb","Type":"ContainerStarted","Data":"9746f1b56dffe77641f2c7fe5728e7f6c632c4fb9b2fded0613a031a10e9e8a6"} Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.648102 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" event={"ID":"8c5234c9-faac-4793-9514-d444ddca8a0d","Type":"ContainerStarted","Data":"0771f4273262085683bcda42ec05f1a084327d5be4d9429a2270531162750e51"} Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.651001 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lgr4h" event={"ID":"4d316d5e-f411-4940-af4d-9c42f5baae63","Type":"ContainerStarted","Data":"f52b6d1984a3852d12d9bb25723573b748d50ed3012cbefe0f42203494fb5cc7"} Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.658404 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-796f89ff4f-qpp7l" event={"ID":"43c23d38-4d5e-4b2e-a062-8384dd8d8138","Type":"ContainerStarted","Data":"67b379af3161bb40d9a5f23c577c5e1618a5175c4eab01bca9510678cf5fd2d2"} Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.666577 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wcvcw" event={"ID":"82832c17-408b-4b89-992f-09e393024fe2","Type":"ContainerStarted","Data":"6f27a5822343bfedb48b948d2add12e9b4d284c7e6f64fd32867fbdd95f0d54e"} Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.671830 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6978d9ddc7-5hs4k" event={"ID":"e07f6c1a-40d5-409f-bb52-c32cbcdca08f","Type":"ContainerStarted","Data":"05f3f74fb706d149120fb512b6a193958e85a662a16338c718ab91a5eb92939f"} Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.673542 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7050eaa4-061a-4dd3-b4da-73e2abd04458","Type":"ContainerStarted","Data":"9202a00eed664a74b2d05e905ce3a5fac90948c27a204f5e1b3d0ea9524b3ed4"} Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.694516 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-t5mkq"] Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.706034 4664 scope.go:117] "RemoveContainer" containerID="df6cc01e86b33f67c257bf38c657edf17a2b893728e293db060151d521c5f468" Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.721691 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-6d5b6d6b67-t5mkq"] Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.812022 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-k59vh"] Oct 03 08:08:28 crc kubenswrapper[4664]: I1003 08:08:28.938768 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c84f6cb8c-j8pcf"] Oct 03 08:08:28 crc kubenswrapper[4664]: W1003 08:08:28.968747 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23020330_9f07_4330_9861_ba58dd62e6b2.slice/crio-cba91cf302a556ea15d9c821d2d4e40c33b6405dfd2411e54ea7ac322ea05b17 WatchSource:0}: Error finding container cba91cf302a556ea15d9c821d2d4e40c33b6405dfd2411e54ea7ac322ea05b17: Status 404 returned error can't find the container with id cba91cf302a556ea15d9c821d2d4e40c33b6405dfd2411e54ea7ac322ea05b17 Oct 03 08:08:29 crc kubenswrapper[4664]: I1003 08:08:29.702941 4664 generic.go:334] "Generic (PLEG): container finished" podID="8c5234c9-faac-4793-9514-d444ddca8a0d" containerID="40caa1e8c7229f9ba81c0f8cb249614c0e2ec251b2066a9c8185b30b3945ab58" exitCode=0 Oct 03 08:08:29 crc kubenswrapper[4664]: I1003 08:08:29.703682 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" event={"ID":"8c5234c9-faac-4793-9514-d444ddca8a0d","Type":"ContainerDied","Data":"40caa1e8c7229f9ba81c0f8cb249614c0e2ec251b2066a9c8185b30b3945ab58"} Oct 03 08:08:29 crc kubenswrapper[4664]: I1003 08:08:29.707527 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-k59vh" event={"ID":"c13de59d-0879-41ca-95cc-e8bf05c223eb","Type":"ContainerStarted","Data":"19851dc8d6e21a2ae3d5f3d4df81b600f75122f7696c8c39b15d5b044c20f252"} Oct 03 08:08:29 crc kubenswrapper[4664]: I1003 08:08:29.726652 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vb5f6" event={"ID":"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc","Type":"ContainerStarted","Data":"b9c3c4d136277110b5b01a873e525e489aff77e0d0ef27946aaeaf217e35ca6c"} Oct 03 08:08:29 crc kubenswrapper[4664]: I1003 08:08:29.740143 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c84f6cb8c-j8pcf" event={"ID":"23020330-9f07-4330-9861-ba58dd62e6b2","Type":"ContainerStarted","Data":"cba91cf302a556ea15d9c821d2d4e40c33b6405dfd2411e54ea7ac322ea05b17"} Oct 03 08:08:29 crc kubenswrapper[4664]: I1003 08:08:29.745002 4664 generic.go:334] "Generic (PLEG): container finished" podID="9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" containerID="99b3c3c6e99c5f67161b1252b5c38eedb50cc2f6291ee26d308f4cebd24707be" exitCode=0 Oct 03 08:08:29 crc kubenswrapper[4664]: I1003 08:08:29.745081 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" event={"ID":"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd","Type":"ContainerDied","Data":"99b3c3c6e99c5f67161b1252b5c38eedb50cc2f6291ee26d308f4cebd24707be"} Oct 03 08:08:29 crc kubenswrapper[4664]: I1003 08:08:29.758069 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vb5f6" podStartSLOduration=5.758050402 podStartE2EDuration="5.758050402s" podCreationTimestamp="2025-10-03 08:08:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:08:29.757320191 +0000 UTC m=+1210.578510711" watchObservedRunningTime="2025-10-03 08:08:29.758050402 +0000 UTC m=+1210.579240902" Oct 03 08:08:29 crc 
kubenswrapper[4664]: I1003 08:08:29.805373 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-55xrn" event={"ID":"576a3dfe-80a8-4676-b79e-1bac2c67b8fb","Type":"ContainerStarted","Data":"d521b5f2af42cfcb808e03eca31aa839fac7c4f87393779a5956121158d6926c"} Oct 03 08:08:29 crc kubenswrapper[4664]: I1003 08:08:29.853852 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-55xrn" podStartSLOduration=6.853832275 podStartE2EDuration="6.853832275s" podCreationTimestamp="2025-10-03 08:08:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:08:29.835160832 +0000 UTC m=+1210.656351342" watchObservedRunningTime="2025-10-03 08:08:29.853832275 +0000 UTC m=+1210.675022765" Oct 03 08:08:29 crc kubenswrapper[4664]: I1003 08:08:29.968633 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93914ba6-0d1c-4808-ab73-9e0496e68f51" path="/var/lib/kubelet/pods/93914ba6-0d1c-4808-ab73-9e0496e68f51/volumes" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.387856 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.448140 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-dns-swift-storage-0\") pod \"8c5234c9-faac-4793-9514-d444ddca8a0d\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.448200 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-ovsdbserver-sb\") pod \"8c5234c9-faac-4793-9514-d444ddca8a0d\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.448247 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-ovsdbserver-nb\") pod \"8c5234c9-faac-4793-9514-d444ddca8a0d\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.448302 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-dns-svc\") pod \"8c5234c9-faac-4793-9514-d444ddca8a0d\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.448438 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf2lm\" (UniqueName: \"kubernetes.io/projected/8c5234c9-faac-4793-9514-d444ddca8a0d-kube-api-access-lf2lm\") pod \"8c5234c9-faac-4793-9514-d444ddca8a0d\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.448484 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-config\") pod \"8c5234c9-faac-4793-9514-d444ddca8a0d\" (UID: \"8c5234c9-faac-4793-9514-d444ddca8a0d\") " Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.457891 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8c5234c9-faac-4793-9514-d444ddca8a0d-kube-api-access-lf2lm" (OuterVolumeSpecName: "kube-api-access-lf2lm") pod "8c5234c9-faac-4793-9514-d444ddca8a0d" (UID: "8c5234c9-faac-4793-9514-d444ddca8a0d"). InnerVolumeSpecName "kube-api-access-lf2lm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.480225 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8c5234c9-faac-4793-9514-d444ddca8a0d" (UID: "8c5234c9-faac-4793-9514-d444ddca8a0d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.497241 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8c5234c9-faac-4793-9514-d444ddca8a0d" (UID: "8c5234c9-faac-4793-9514-d444ddca8a0d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.509284 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-config" (OuterVolumeSpecName: "config") pod "8c5234c9-faac-4793-9514-d444ddca8a0d" (UID: "8c5234c9-faac-4793-9514-d444ddca8a0d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.516778 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8c5234c9-faac-4793-9514-d444ddca8a0d" (UID: "8c5234c9-faac-4793-9514-d444ddca8a0d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.518980 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8c5234c9-faac-4793-9514-d444ddca8a0d" (UID: "8c5234c9-faac-4793-9514-d444ddca8a0d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.550568 4664 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.550614 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.550630 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.550643 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.550654 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf2lm\" (UniqueName: \"kubernetes.io/projected/8c5234c9-faac-4793-9514-d444ddca8a0d-kube-api-access-lf2lm\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.550696 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c5234c9-faac-4793-9514-d444ddca8a0d-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.828492 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" event={"ID":"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd","Type":"ContainerStarted","Data":"64d835e96ba4280eb7e483668dddb5bde20f1a1b8703908dd6b6d2c6bd5af680"} Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.828562 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.832222 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.832679 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-hgzvv" event={"ID":"8c5234c9-faac-4793-9514-d444ddca8a0d","Type":"ContainerDied","Data":"0771f4273262085683bcda42ec05f1a084327d5be4d9429a2270531162750e51"} Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.832740 4664 scope.go:117] "RemoveContainer" containerID="40caa1e8c7229f9ba81c0f8cb249614c0e2ec251b2066a9c8185b30b3945ab58" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.873360 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" podStartSLOduration=6.873343302 podStartE2EDuration="6.873343302s" podCreationTimestamp="2025-10-03 08:08:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:08:30.871985903 +0000 UTC m=+1211.693176403" watchObservedRunningTime="2025-10-03 08:08:30.873343302 +0000 UTC m=+1211.694533792" Oct 03 08:08:30 crc kubenswrapper[4664]: I1003 08:08:30.952381 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-t5mkq" podUID="93914ba6-0d1c-4808-ab73-9e0496e68f51" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Oct 03 08:08:31 crc kubenswrapper[4664]: I1003 08:08:31.005699 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-hgzvv"] Oct 03 08:08:31 crc kubenswrapper[4664]: I1003 08:08:31.061702 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-hgzvv"] Oct 03 08:08:31 crc kubenswrapper[4664]: I1003 08:08:31.947849 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c5234c9-faac-4793-9514-d444ddca8a0d" path="/var/lib/kubelet/pods/8c5234c9-faac-4793-9514-d444ddca8a0d/volumes" Oct 03 08:08:32 crc kubenswrapper[4664]: I1003 08:08:32.924199 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-796f89ff4f-qpp7l"] Oct 03 08:08:32 crc kubenswrapper[4664]: I1003 08:08:32.952544 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76644f9584-br5jb"] Oct 03 08:08:32 crc kubenswrapper[4664]: E1003 08:08:32.953040 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93914ba6-0d1c-4808-ab73-9e0496e68f51" containerName="dnsmasq-dns" Oct 03 08:08:32 crc kubenswrapper[4664]: I1003 08:08:32.953057 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="93914ba6-0d1c-4808-ab73-9e0496e68f51" containerName="dnsmasq-dns" Oct 03 08:08:32 crc kubenswrapper[4664]: E1003 08:08:32.953083 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93914ba6-0d1c-4808-ab73-9e0496e68f51" containerName="init" Oct 03 08:08:32 crc kubenswrapper[4664]: I1003 08:08:32.953091 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="93914ba6-0d1c-4808-ab73-9e0496e68f51" containerName="init" Oct 03 08:08:32 crc kubenswrapper[4664]: E1003 08:08:32.953111 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5234c9-faac-4793-9514-d444ddca8a0d" containerName="init" Oct 03 08:08:32 crc kubenswrapper[4664]: I1003 08:08:32.953118 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5234c9-faac-4793-9514-d444ddca8a0d" containerName="init" Oct 03 08:08:32 crc kubenswrapper[4664]: I1003 08:08:32.953310 4664 
memory_manager.go:354] "RemoveStaleState removing state" podUID="93914ba6-0d1c-4808-ab73-9e0496e68f51" containerName="dnsmasq-dns" Oct 03 08:08:32 crc kubenswrapper[4664]: I1003 08:08:32.953332 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5234c9-faac-4793-9514-d444ddca8a0d" containerName="init" Oct 03 08:08:32 crc kubenswrapper[4664]: I1003 08:08:32.954320 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:32 crc kubenswrapper[4664]: I1003 08:08:32.959336 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 03 08:08:32 crc kubenswrapper[4664]: I1003 08:08:32.967325 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76644f9584-br5jb"] Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.129671 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c84f6cb8c-j8pcf"] Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.131159 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-combined-ca-bundle\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.131228 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30ce3373-ef30-4727-b57f-5be7963d1892-config-data\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.131294 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwxlz\" (UniqueName: \"kubernetes.io/projected/30ce3373-ef30-4727-b57f-5be7963d1892-kube-api-access-qwxlz\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.131374 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-horizon-secret-key\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.131422 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-horizon-tls-certs\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.131454 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30ce3373-ef30-4727-b57f-5be7963d1892-scripts\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.131626 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ce3373-ef30-4727-b57f-5be7963d1892-logs\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.168396 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-85fcf9fb6-r8r76"] Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.170328 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.180055 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85fcf9fb6-r8r76"] Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.242390 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ce3373-ef30-4727-b57f-5be7963d1892-logs\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.242468 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-combined-ca-bundle\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.252630 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30ce3373-ef30-4727-b57f-5be7963d1892-config-data\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.244229 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ce3373-ef30-4727-b57f-5be7963d1892-logs\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.257484 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwxlz\" (UniqueName: \"kubernetes.io/projected/30ce3373-ef30-4727-b57f-5be7963d1892-kube-api-access-qwxlz\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.257819 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-horizon-secret-key\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.257893 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-horizon-tls-certs\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.257938 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/30ce3373-ef30-4727-b57f-5be7963d1892-scripts\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.258969 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30ce3373-ef30-4727-b57f-5be7963d1892-scripts\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.259260 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30ce3373-ef30-4727-b57f-5be7963d1892-config-data\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.277171 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-horizon-secret-key\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.277559 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-horizon-tls-certs\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.278932 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-combined-ca-bundle\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.295233 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwxlz\" (UniqueName: \"kubernetes.io/projected/30ce3373-ef30-4727-b57f-5be7963d1892-kube-api-access-qwxlz\") pod \"horizon-76644f9584-br5jb\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") " pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.361285 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f1015cf1-8e4b-44fd-a794-27edfecdceed-horizon-secret-key\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.361352 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1015cf1-8e4b-44fd-a794-27edfecdceed-combined-ca-bundle\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.361518 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1015cf1-8e4b-44fd-a794-27edfecdceed-config-data\") pod \"horizon-85fcf9fb6-r8r76\" 
(UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.361548 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbv9p\" (UniqueName: \"kubernetes.io/projected/f1015cf1-8e4b-44fd-a794-27edfecdceed-kube-api-access-lbv9p\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.361661 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1015cf1-8e4b-44fd-a794-27edfecdceed-scripts\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.361691 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1015cf1-8e4b-44fd-a794-27edfecdceed-logs\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.361726 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1015cf1-8e4b-44fd-a794-27edfecdceed-horizon-tls-certs\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.463114 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1015cf1-8e4b-44fd-a794-27edfecdceed-scripts\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.463175 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1015cf1-8e4b-44fd-a794-27edfecdceed-logs\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.463232 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1015cf1-8e4b-44fd-a794-27edfecdceed-horizon-tls-certs\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.463302 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f1015cf1-8e4b-44fd-a794-27edfecdceed-horizon-secret-key\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.463360 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1015cf1-8e4b-44fd-a794-27edfecdceed-combined-ca-bundle\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 
crc kubenswrapper[4664]: I1003 08:08:33.463402 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1015cf1-8e4b-44fd-a794-27edfecdceed-config-data\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.463426 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbv9p\" (UniqueName: \"kubernetes.io/projected/f1015cf1-8e4b-44fd-a794-27edfecdceed-kube-api-access-lbv9p\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.465841 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1015cf1-8e4b-44fd-a794-27edfecdceed-scripts\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.466162 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1015cf1-8e4b-44fd-a794-27edfecdceed-logs\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.468042 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1015cf1-8e4b-44fd-a794-27edfecdceed-config-data\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.468077 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1015cf1-8e4b-44fd-a794-27edfecdceed-horizon-tls-certs\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.469111 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1015cf1-8e4b-44fd-a794-27edfecdceed-combined-ca-bundle\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.469243 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f1015cf1-8e4b-44fd-a794-27edfecdceed-horizon-secret-key\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.490383 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbv9p\" (UniqueName: \"kubernetes.io/projected/f1015cf1-8e4b-44fd-a794-27edfecdceed-kube-api-access-lbv9p\") pod \"horizon-85fcf9fb6-r8r76\" (UID: \"f1015cf1-8e4b-44fd-a794-27edfecdceed\") " pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.515953 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:08:33 crc kubenswrapper[4664]: I1003 08:08:33.581547 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:08:35 crc kubenswrapper[4664]: I1003 08:08:35.170222 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:08:35 crc kubenswrapper[4664]: I1003 08:08:35.256312 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-7nzs5"] Oct 03 08:08:35 crc kubenswrapper[4664]: I1003 08:08:35.256637 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" podUID="455d48b8-17a8-4c2f-9923-a30626506388" containerName="dnsmasq-dns" containerID="cri-o://f11f91f1b8e17dcbb36ff90cf9bf1aecf3a5a37bbc0f64b527cf36e74cb8321f" gracePeriod=10 Oct 03 08:08:35 crc kubenswrapper[4664]: I1003 08:08:35.911409 4664 generic.go:334] "Generic (PLEG): container finished" podID="576a3dfe-80a8-4676-b79e-1bac2c67b8fb" containerID="d521b5f2af42cfcb808e03eca31aa839fac7c4f87393779a5956121158d6926c" exitCode=0 Oct 03 08:08:35 crc kubenswrapper[4664]: I1003 08:08:35.911486 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-55xrn" event={"ID":"576a3dfe-80a8-4676-b79e-1bac2c67b8fb","Type":"ContainerDied","Data":"d521b5f2af42cfcb808e03eca31aa839fac7c4f87393779a5956121158d6926c"} Oct 03 08:08:35 crc kubenswrapper[4664]: I1003 08:08:35.914062 4664 generic.go:334] "Generic (PLEG): container finished" podID="455d48b8-17a8-4c2f-9923-a30626506388" containerID="f11f91f1b8e17dcbb36ff90cf9bf1aecf3a5a37bbc0f64b527cf36e74cb8321f" exitCode=0 Oct 03 08:08:35 crc kubenswrapper[4664]: I1003 08:08:35.914094 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" event={"ID":"455d48b8-17a8-4c2f-9923-a30626506388","Type":"ContainerDied","Data":"f11f91f1b8e17dcbb36ff90cf9bf1aecf3a5a37bbc0f64b527cf36e74cb8321f"} Oct 03 08:08:36 crc kubenswrapper[4664]: E1003 08:08:36.005584 4664 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod576a3dfe_80a8_4676_b79e_1bac2c67b8fb.slice/crio-d521b5f2af42cfcb808e03eca31aa839fac7c4f87393779a5956121158d6926c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod576a3dfe_80a8_4676_b79e_1bac2c67b8fb.slice/crio-conmon-d521b5f2af42cfcb808e03eca31aa839fac7c4f87393779a5956121158d6926c.scope\": RecentStats: unable to find data in memory cache]" Oct 03 08:08:38 crc kubenswrapper[4664]: I1003 08:08:38.273109 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" podUID="455d48b8-17a8-4c2f-9923-a30626506388" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: connect: connection refused" Oct 03 08:08:41 crc kubenswrapper[4664]: I1003 08:08:41.986812 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:08:41 crc kubenswrapper[4664]: I1003 08:08:41.987178 4664 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:08:43 crc kubenswrapper[4664]: I1003 08:08:43.274236 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" podUID="455d48b8-17a8-4c2f-9923-a30626506388" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: connect: connection refused" Oct 03 08:08:45 crc kubenswrapper[4664]: E1003 08:08:45.276228 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 03 08:08:45 crc kubenswrapper[4664]: E1003 08:08:45.276923 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndh56bh5c5h646h685h5bdh567hc4h9fh569h5c8h64h54chdbh8bh577h5fbh65fh556h59chc9h5d6h68bh565h55dhf7h67fh5bfh58chdch97h5bbq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6r7gb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6c84f6cb8c-j8pcf_openstack(23020330-9f07-4330-9861-ba58dd62e6b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:08:45 crc kubenswrapper[4664]: E1003 08:08:45.767087 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" 
pod="openstack/horizon-6c84f6cb8c-j8pcf" podUID="23020330-9f07-4330-9861-ba58dd62e6b2" Oct 03 08:08:45 crc kubenswrapper[4664]: E1003 08:08:45.789595 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 03 08:08:45 crc kubenswrapper[4664]: E1003 08:08:45.789806 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n576h5f5h664h59bh57ch568h57fh67h595h57hc7hd8h65h576h554h576hb9hb9h57dh556hffh8bh59fh5f5h5c7h5d4h668hfdh584h584h65hc5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k89f8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-796f89ff4f-qpp7l_openstack(43c23d38-4d5e-4b2e-a062-8384dd8d8138): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:08:45 crc kubenswrapper[4664]: E1003 08:08:45.793165 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-796f89ff4f-qpp7l" podUID="43c23d38-4d5e-4b2e-a062-8384dd8d8138" Oct 03 08:08:45 crc kubenswrapper[4664]: E1003 08:08:45.802731 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 03 08:08:45 crc kubenswrapper[4664]: E1003 08:08:45.802888 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n6ch675h669hfbh665hcch68bh676h545h5fbh7bh5b8h5cbhffh68h94h65ch5c6h5dch689hdbh57fh5b9h65dh87h59fh548h65ch56bh68bh89h5bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rfkn4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6978d9ddc7-5hs4k_openstack(e07f6c1a-40d5-409f-bb52-c32cbcdca08f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:08:45 crc kubenswrapper[4664]: E1003 08:08:45.806092 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6978d9ddc7-5hs4k" podUID="e07f6c1a-40d5-409f-bb52-c32cbcdca08f" Oct 03 08:08:49 crc kubenswrapper[4664]: E1003 08:08:49.678209 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Oct 03 08:08:49 crc kubenswrapper[4664]: E1003 08:08:49.678954 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gr44v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-k59vh_openstack(c13de59d-0879-41ca-95cc-e8bf05c223eb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:08:49 crc kubenswrapper[4664]: E1003 08:08:49.684778 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-k59vh" podUID="c13de59d-0879-41ca-95cc-e8bf05c223eb" Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.757757 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.896413 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ljrh\" (UniqueName: \"kubernetes.io/projected/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-kube-api-access-7ljrh\") pod \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.896518 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-credential-keys\") pod \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.896567 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-config-data\") pod \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.896723 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-scripts\") pod \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.896763 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-combined-ca-bundle\") pod \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.896817 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-fernet-keys\") pod \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\" (UID: \"576a3dfe-80a8-4676-b79e-1bac2c67b8fb\") " Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.904295 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "576a3dfe-80a8-4676-b79e-1bac2c67b8fb" (UID: "576a3dfe-80a8-4676-b79e-1bac2c67b8fb"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.904586 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-scripts" (OuterVolumeSpecName: "scripts") pod "576a3dfe-80a8-4676-b79e-1bac2c67b8fb" (UID: "576a3dfe-80a8-4676-b79e-1bac2c67b8fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.907449 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-kube-api-access-7ljrh" (OuterVolumeSpecName: "kube-api-access-7ljrh") pod "576a3dfe-80a8-4676-b79e-1bac2c67b8fb" (UID: "576a3dfe-80a8-4676-b79e-1bac2c67b8fb"). InnerVolumeSpecName "kube-api-access-7ljrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.915902 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "576a3dfe-80a8-4676-b79e-1bac2c67b8fb" (UID: "576a3dfe-80a8-4676-b79e-1bac2c67b8fb"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.928950 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-config-data" (OuterVolumeSpecName: "config-data") pod "576a3dfe-80a8-4676-b79e-1bac2c67b8fb" (UID: "576a3dfe-80a8-4676-b79e-1bac2c67b8fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.932410 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "576a3dfe-80a8-4676-b79e-1bac2c67b8fb" (UID: "576a3dfe-80a8-4676-b79e-1bac2c67b8fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.999394 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ljrh\" (UniqueName: \"kubernetes.io/projected/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-kube-api-access-7ljrh\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.999434 4664 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.999446 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.999456 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.999467 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:49 crc kubenswrapper[4664]: I1003 08:08:49.999479 4664 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/576a3dfe-80a8-4676-b79e-1bac2c67b8fb-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:08:50 crc kubenswrapper[4664]: I1003 08:08:50.054058 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-55xrn" Oct 03 08:08:50 crc kubenswrapper[4664]: I1003 08:08:50.054075 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-55xrn" event={"ID":"576a3dfe-80a8-4676-b79e-1bac2c67b8fb","Type":"ContainerDied","Data":"9746f1b56dffe77641f2c7fe5728e7f6c632c4fb9b2fded0613a031a10e9e8a6"} Oct 03 08:08:50 crc kubenswrapper[4664]: I1003 08:08:50.054510 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9746f1b56dffe77641f2c7fe5728e7f6c632c4fb9b2fded0613a031a10e9e8a6" Oct 03 08:08:50 crc kubenswrapper[4664]: E1003 08:08:50.056338 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-k59vh" podUID="c13de59d-0879-41ca-95cc-e8bf05c223eb" Oct 03 08:08:50 crc kubenswrapper[4664]: I1003 08:08:50.863993 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-55xrn"] Oct 03 08:08:50 crc kubenswrapper[4664]: I1003 08:08:50.872540 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-55xrn"] Oct 03 08:08:50 crc kubenswrapper[4664]: I1003 08:08:50.957019 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-q2khg"] Oct 03 08:08:50 crc kubenswrapper[4664]: E1003 08:08:50.957417 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="576a3dfe-80a8-4676-b79e-1bac2c67b8fb" containerName="keystone-bootstrap" Oct 03 08:08:50 crc kubenswrapper[4664]: I1003 08:08:50.957435 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="576a3dfe-80a8-4676-b79e-1bac2c67b8fb" containerName="keystone-bootstrap" Oct 03 08:08:50 crc kubenswrapper[4664]: I1003 08:08:50.957663 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="576a3dfe-80a8-4676-b79e-1bac2c67b8fb" containerName="keystone-bootstrap" Oct 03 08:08:50 crc kubenswrapper[4664]: I1003 08:08:50.958439 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:50 crc kubenswrapper[4664]: I1003 08:08:50.960882 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 08:08:50 crc kubenswrapper[4664]: I1003 08:08:50.961323 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 08:08:50 crc kubenswrapper[4664]: I1003 08:08:50.962180 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bm9nt" Oct 03 08:08:50 crc kubenswrapper[4664]: I1003 08:08:50.962860 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 08:08:50 crc kubenswrapper[4664]: I1003 08:08:50.975660 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q2khg"] Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.064939 4664 generic.go:334] "Generic (PLEG): container finished" podID="d75d0f35-4cda-4925-8d0d-1666f794ce9b" containerID="c62a70250bacca48817b9ee175f91b0a12a26ee8d51bc5f0a3fe50be3677205e" exitCode=0 Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.064993 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-djcmk" event={"ID":"d75d0f35-4cda-4925-8d0d-1666f794ce9b","Type":"ContainerDied","Data":"c62a70250bacca48817b9ee175f91b0a12a26ee8d51bc5f0a3fe50be3677205e"} Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.128660 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-combined-ca-bundle\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.128782 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-credential-keys\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.128899 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-fernet-keys\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.128968 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-scripts\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.129005 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjw2d\" (UniqueName: \"kubernetes.io/projected/6eef4071-9678-4d8e-a595-3ab2d97f1862-kube-api-access-fjw2d\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.129054 4664 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-config-data\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.230749 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-config-data\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.230897 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-combined-ca-bundle\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.230920 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-credential-keys\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.230985 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-fernet-keys\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.231016 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-scripts\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.231062 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjw2d\" (UniqueName: \"kubernetes.io/projected/6eef4071-9678-4d8e-a595-3ab2d97f1862-kube-api-access-fjw2d\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.236829 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-fernet-keys\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.237075 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-scripts\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.237703 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-credential-keys\") pod 
\"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.239904 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-config-data\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.249129 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-combined-ca-bundle\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.250012 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjw2d\" (UniqueName: \"kubernetes.io/projected/6eef4071-9678-4d8e-a595-3ab2d97f1862-kube-api-access-fjw2d\") pod \"keystone-bootstrap-q2khg\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.292335 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:08:51 crc kubenswrapper[4664]: I1003 08:08:51.889400 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="576a3dfe-80a8-4676-b79e-1bac2c67b8fb" path="/var/lib/kubelet/pods/576a3dfe-80a8-4676-b79e-1bac2c67b8fb/volumes" Oct 03 08:08:53 crc kubenswrapper[4664]: I1003 08:08:53.273200 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" podUID="455d48b8-17a8-4c2f-9923-a30626506388" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: i/o timeout" Oct 03 08:08:53 crc kubenswrapper[4664]: I1003 08:08:53.273692 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:08:58 crc kubenswrapper[4664]: I1003 08:08:58.274428 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" podUID="455d48b8-17a8-4c2f-9923-a30626506388" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: i/o timeout" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.588661 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.599685 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.605950 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.615165 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.733944 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-config-data\") pod \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734009 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b49bw\" (UniqueName: \"kubernetes.io/projected/455d48b8-17a8-4c2f-9923-a30626506388-kube-api-access-b49bw\") pod \"455d48b8-17a8-4c2f-9923-a30626506388\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734046 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-ovsdbserver-sb\") pod \"455d48b8-17a8-4c2f-9923-a30626506388\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734070 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r7gb\" (UniqueName: \"kubernetes.io/projected/23020330-9f07-4330-9861-ba58dd62e6b2-kube-api-access-6r7gb\") pod \"23020330-9f07-4330-9861-ba58dd62e6b2\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734112 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/23020330-9f07-4330-9861-ba58dd62e6b2-horizon-secret-key\") pod \"23020330-9f07-4330-9861-ba58dd62e6b2\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734142 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-dns-svc\") pod \"455d48b8-17a8-4c2f-9923-a30626506388\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734160 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43c23d38-4d5e-4b2e-a062-8384dd8d8138-config-data\") pod \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734201 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-horizon-secret-key\") pod \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734220 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43c23d38-4d5e-4b2e-a062-8384dd8d8138-logs\") pod \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734255 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-ovsdbserver-nb\") pod \"455d48b8-17a8-4c2f-9923-a30626506388\" (UID: 
\"455d48b8-17a8-4c2f-9923-a30626506388\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734300 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfkn4\" (UniqueName: \"kubernetes.io/projected/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-kube-api-access-rfkn4\") pod \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734323 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43c23d38-4d5e-4b2e-a062-8384dd8d8138-scripts\") pod \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734367 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-scripts\") pod \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734401 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23020330-9f07-4330-9861-ba58dd62e6b2-scripts\") pod \"23020330-9f07-4330-9861-ba58dd62e6b2\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734439 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-config\") pod \"455d48b8-17a8-4c2f-9923-a30626506388\" (UID: \"455d48b8-17a8-4c2f-9923-a30626506388\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734476 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-logs\") pod \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\" (UID: \"e07f6c1a-40d5-409f-bb52-c32cbcdca08f\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734506 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23020330-9f07-4330-9861-ba58dd62e6b2-config-data\") pod \"23020330-9f07-4330-9861-ba58dd62e6b2\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734534 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23020330-9f07-4330-9861-ba58dd62e6b2-logs\") pod \"23020330-9f07-4330-9861-ba58dd62e6b2\" (UID: \"23020330-9f07-4330-9861-ba58dd62e6b2\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734564 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43c23d38-4d5e-4b2e-a062-8384dd8d8138-horizon-secret-key\") pod \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.734653 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k89f8\" (UniqueName: \"kubernetes.io/projected/43c23d38-4d5e-4b2e-a062-8384dd8d8138-kube-api-access-k89f8\") pod \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\" (UID: \"43c23d38-4d5e-4b2e-a062-8384dd8d8138\") " Oct 03 08:09:01 crc kubenswrapper[4664]: 
I1003 08:09:01.735585 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-config-data" (OuterVolumeSpecName: "config-data") pod "e07f6c1a-40d5-409f-bb52-c32cbcdca08f" (UID: "e07f6c1a-40d5-409f-bb52-c32cbcdca08f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.735704 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43c23d38-4d5e-4b2e-a062-8384dd8d8138-config-data" (OuterVolumeSpecName: "config-data") pod "43c23d38-4d5e-4b2e-a062-8384dd8d8138" (UID: "43c23d38-4d5e-4b2e-a062-8384dd8d8138"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.735795 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-scripts" (OuterVolumeSpecName: "scripts") pod "e07f6c1a-40d5-409f-bb52-c32cbcdca08f" (UID: "e07f6c1a-40d5-409f-bb52-c32cbcdca08f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.736278 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-logs" (OuterVolumeSpecName: "logs") pod "e07f6c1a-40d5-409f-bb52-c32cbcdca08f" (UID: "e07f6c1a-40d5-409f-bb52-c32cbcdca08f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.736531 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43c23d38-4d5e-4b2e-a062-8384dd8d8138-scripts" (OuterVolumeSpecName: "scripts") pod "43c23d38-4d5e-4b2e-a062-8384dd8d8138" (UID: "43c23d38-4d5e-4b2e-a062-8384dd8d8138"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.736950 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23020330-9f07-4330-9861-ba58dd62e6b2-scripts" (OuterVolumeSpecName: "scripts") pod "23020330-9f07-4330-9861-ba58dd62e6b2" (UID: "23020330-9f07-4330-9861-ba58dd62e6b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.737170 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23020330-9f07-4330-9861-ba58dd62e6b2-logs" (OuterVolumeSpecName: "logs") pod "23020330-9f07-4330-9861-ba58dd62e6b2" (UID: "23020330-9f07-4330-9861-ba58dd62e6b2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.737866 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23020330-9f07-4330-9861-ba58dd62e6b2-config-data" (OuterVolumeSpecName: "config-data") pod "23020330-9f07-4330-9861-ba58dd62e6b2" (UID: "23020330-9f07-4330-9861-ba58dd62e6b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.738438 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43c23d38-4d5e-4b2e-a062-8384dd8d8138-logs" (OuterVolumeSpecName: "logs") pod "43c23d38-4d5e-4b2e-a062-8384dd8d8138" (UID: "43c23d38-4d5e-4b2e-a062-8384dd8d8138"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.742450 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/455d48b8-17a8-4c2f-9923-a30626506388-kube-api-access-b49bw" (OuterVolumeSpecName: "kube-api-access-b49bw") pod "455d48b8-17a8-4c2f-9923-a30626506388" (UID: "455d48b8-17a8-4c2f-9923-a30626506388"). InnerVolumeSpecName "kube-api-access-b49bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.744185 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23020330-9f07-4330-9861-ba58dd62e6b2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "23020330-9f07-4330-9861-ba58dd62e6b2" (UID: "23020330-9f07-4330-9861-ba58dd62e6b2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.744196 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-kube-api-access-rfkn4" (OuterVolumeSpecName: "kube-api-access-rfkn4") pod "e07f6c1a-40d5-409f-bb52-c32cbcdca08f" (UID: "e07f6c1a-40d5-409f-bb52-c32cbcdca08f"). InnerVolumeSpecName "kube-api-access-rfkn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.745263 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e07f6c1a-40d5-409f-bb52-c32cbcdca08f" (UID: "e07f6c1a-40d5-409f-bb52-c32cbcdca08f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.746928 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23020330-9f07-4330-9861-ba58dd62e6b2-kube-api-access-6r7gb" (OuterVolumeSpecName: "kube-api-access-6r7gb") pod "23020330-9f07-4330-9861-ba58dd62e6b2" (UID: "23020330-9f07-4330-9861-ba58dd62e6b2"). InnerVolumeSpecName "kube-api-access-6r7gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.758665 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43c23d38-4d5e-4b2e-a062-8384dd8d8138-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "43c23d38-4d5e-4b2e-a062-8384dd8d8138" (UID: "43c23d38-4d5e-4b2e-a062-8384dd8d8138"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.759335 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43c23d38-4d5e-4b2e-a062-8384dd8d8138-kube-api-access-k89f8" (OuterVolumeSpecName: "kube-api-access-k89f8") pod "43c23d38-4d5e-4b2e-a062-8384dd8d8138" (UID: "43c23d38-4d5e-4b2e-a062-8384dd8d8138"). InnerVolumeSpecName "kube-api-access-k89f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.796528 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "455d48b8-17a8-4c2f-9923-a30626506388" (UID: "455d48b8-17a8-4c2f-9923-a30626506388"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.796784 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-config" (OuterVolumeSpecName: "config") pod "455d48b8-17a8-4c2f-9923-a30626506388" (UID: "455d48b8-17a8-4c2f-9923-a30626506388"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.805458 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "455d48b8-17a8-4c2f-9923-a30626506388" (UID: "455d48b8-17a8-4c2f-9923-a30626506388"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.806955 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "455d48b8-17a8-4c2f-9923-a30626506388" (UID: "455d48b8-17a8-4c2f-9923-a30626506388"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836522 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836564 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfkn4\" (UniqueName: \"kubernetes.io/projected/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-kube-api-access-rfkn4\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836578 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43c23d38-4d5e-4b2e-a062-8384dd8d8138-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836594 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836622 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23020330-9f07-4330-9861-ba58dd62e6b2-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836632 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836641 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836650 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23020330-9f07-4330-9861-ba58dd62e6b2-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836658 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23020330-9f07-4330-9861-ba58dd62e6b2-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836667 4664 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43c23d38-4d5e-4b2e-a062-8384dd8d8138-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836676 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k89f8\" (UniqueName: \"kubernetes.io/projected/43c23d38-4d5e-4b2e-a062-8384dd8d8138-kube-api-access-k89f8\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836685 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836695 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b49bw\" (UniqueName: \"kubernetes.io/projected/455d48b8-17a8-4c2f-9923-a30626506388-kube-api-access-b49bw\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836704 4664 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836713 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r7gb\" (UniqueName: \"kubernetes.io/projected/23020330-9f07-4330-9861-ba58dd62e6b2-kube-api-access-6r7gb\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836721 4664 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/23020330-9f07-4330-9861-ba58dd62e6b2-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836731 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/455d48b8-17a8-4c2f-9923-a30626506388-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836740 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43c23d38-4d5e-4b2e-a062-8384dd8d8138-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836749 4664 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e07f6c1a-40d5-409f-bb52-c32cbcdca08f-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:01 crc kubenswrapper[4664]: I1003 08:09:01.836757 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43c23d38-4d5e-4b2e-a062-8384dd8d8138-logs\") on node \"crc\" DevicePath \"\""
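The entries above trace the kubelet volume manager's full teardown path for the deleted horizon/dnsmasq pods: the reconciler flags volumes that are still mounted but no longer desired ("operationExecutor.UnmountVolume started"), the operation executor runs the plugin's TearDown ("UnmountVolume.TearDown succeeded"), and the volume is finally dropped from the actual state of world ("Volume detached ... DevicePath \"\""). A minimal sketch of that desired-vs-actual reconcile loop, with deliberately simplified types that are illustrative assumptions, not kubelet's real API:

```go
// Sketch of the reconcile pattern visible in the log: compare desired state
// (volumes that should stay mounted) with actual state (volumes still mounted
// for deleted pods) and tear down the difference. Simplified, hypothetical types.
package main

import "fmt"

type volume struct{ pod, name string }

func reconcile(desired map[volume]bool, actual []volume, tearDown func(volume) error) {
	for _, v := range actual {
		if desired[v] {
			continue // still wanted by a live pod; leave mounted
		}
		fmt.Printf("UnmountVolume started for volume %q pod %q\n", v.name, v.pod)
		if err := tearDown(v); err != nil {
			fmt.Printf("UnmountVolume.TearDown failed: %v\n", err)
			continue // stays in actual state; retried on the next pass
		}
		fmt.Printf("Volume detached for volume %q\n", v.name)
	}
}

func main() {
	actual := []volume{{pod: "e07f6c1a-40d5-409f-bb52-c32cbcdca08f", name: "config-data"}}
	reconcile(map[volume]bool{}, actual, func(volume) error { return nil })
}
```

The real reconciler re-runs on a short fixed period, which is why a failed teardown only logs an error and is retried automatically rather than aborting the loop.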
Oct 03 08:09:02 crc kubenswrapper[4664]: E1003 08:09:02.121376 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified"
Oct 03 08:09:02 crc kubenswrapper[4664]: E1003 08:09:02.121948 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q4q5z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-wcvcw_openstack(82832c17-408b-4b89-992f-09e393024fe2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 03 08:09:02 crc kubenswrapper[4664]: E1003 08:09:02.123145 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-wcvcw" podUID="82832c17-408b-4b89-992f-09e393024fe2"
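The pull of quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified was canceled mid-copy, so the barbican-db-sync container never started: the kubelet records ErrImagePull here and flips to ImagePullBackOff a moment later (08:09:02.206867 below). When triaging this state, the waiting reason can be read straight from the pod status; a small client-go sketch, assuming the standard out-of-cluster kubeconfig setup:

```go
// Sketch: surface the waiting reason (ErrImagePull / ImagePullBackOff) from
// the pod status with client-go. Namespace and pod name are taken from the
// log above; kubeconfig handling is ordinary out-of-cluster boilerplate.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	pod, err := cs.CoreV1().Pods("openstack").Get(context.TODO(), "barbican-db-sync-wcvcw", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, st := range pod.Status.ContainerStatuses {
		if w := st.State.Waiting; w != nil {
			fmt.Printf("%s: %s (%s)\n", st.Name, w.Reason, w.Message)
		}
	}
}
```

This reason/message pair is the same information `kubectl describe pod` prints under the container's waiting state.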
Need to start a new one" pod="openstack/horizon-796f89ff4f-qpp7l" Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.196216 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c84f6cb8c-j8pcf" event={"ID":"23020330-9f07-4330-9861-ba58dd62e6b2","Type":"ContainerDied","Data":"cba91cf302a556ea15d9c821d2d4e40c33b6405dfd2411e54ea7ac322ea05b17"} Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.196314 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c84f6cb8c-j8pcf" Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.199314 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6978d9ddc7-5hs4k" event={"ID":"e07f6c1a-40d5-409f-bb52-c32cbcdca08f","Type":"ContainerDied","Data":"05f3f74fb706d149120fb512b6a193958e85a662a16338c718ab91a5eb92939f"} Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.199422 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6978d9ddc7-5hs4k" Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.204260 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-djcmk" Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.204558 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-djcmk" event={"ID":"d75d0f35-4cda-4925-8d0d-1666f794ce9b","Type":"ContainerDied","Data":"bd1caeea9dc736dc5ba8ee61a006ea55bfef98dd9b57cf82e43a4b437659e188"} Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.204616 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd1caeea9dc736dc5ba8ee61a006ea55bfef98dd9b57cf82e43a4b437659e188" Oct 03 08:09:02 crc kubenswrapper[4664]: E1003 08:09:02.206867 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-wcvcw" podUID="82832c17-408b-4b89-992f-09e393024fe2" Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.254322 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-796f89ff4f-qpp7l"] Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.268335 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-796f89ff4f-qpp7l"] Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.280575 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-7nzs5"] Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.291313 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-7nzs5"] Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.362022 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c84f6cb8c-j8pcf"] Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.363001 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-config-data\") pod \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.363111 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-db-sync-config-data\") pod \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.363146 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x5fz\" (UniqueName: \"kubernetes.io/projected/d75d0f35-4cda-4925-8d0d-1666f794ce9b-kube-api-access-2x5fz\") pod \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.363211 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-combined-ca-bundle\") pod \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\" (UID: \"d75d0f35-4cda-4925-8d0d-1666f794ce9b\") " Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.369567 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d75d0f35-4cda-4925-8d0d-1666f794ce9b" (UID: "d75d0f35-4cda-4925-8d0d-1666f794ce9b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.372519 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d75d0f35-4cda-4925-8d0d-1666f794ce9b-kube-api-access-2x5fz" (OuterVolumeSpecName: "kube-api-access-2x5fz") pod "d75d0f35-4cda-4925-8d0d-1666f794ce9b" (UID: "d75d0f35-4cda-4925-8d0d-1666f794ce9b"). InnerVolumeSpecName "kube-api-access-2x5fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.385984 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6c84f6cb8c-j8pcf"] Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.413776 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d75d0f35-4cda-4925-8d0d-1666f794ce9b" (UID: "d75d0f35-4cda-4925-8d0d-1666f794ce9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.418154 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6978d9ddc7-5hs4k"] Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.425670 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6978d9ddc7-5hs4k"] Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.427301 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-config-data" (OuterVolumeSpecName: "config-data") pod "d75d0f35-4cda-4925-8d0d-1666f794ce9b" (UID: "d75d0f35-4cda-4925-8d0d-1666f794ce9b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.466140 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.466185 4664 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.466201 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x5fz\" (UniqueName: \"kubernetes.io/projected/d75d0f35-4cda-4925-8d0d-1666f794ce9b-kube-api-access-2x5fz\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.466216 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d75d0f35-4cda-4925-8d0d-1666f794ce9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:02 crc kubenswrapper[4664]: I1003 08:09:02.628229 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76644f9584-br5jb"] Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.275736 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-7nzs5" podUID="455d48b8-17a8-4c2f-9923-a30626506388" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: i/o timeout" Oct 03 08:09:03 crc kubenswrapper[4664]: E1003 08:09:03.468955 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 03 08:09:03 crc kubenswrapper[4664]: E1003 08:09:03.469411 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
Oct 03 08:09:03 crc kubenswrapper[4664]: E1003 08:09:03.468955 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Oct 03 08:09:03 crc kubenswrapper[4664]: E1003 08:09:03.469411 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qhjb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-lgr4h_openstack(4d316d5e-f411-4940-af4d-9c42f5baae63): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 03 08:09:03 crc kubenswrapper[4664]: E1003 08:09:03.473092 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-lgr4h" podUID="4d316d5e-f411-4940-af4d-9c42f5baae63"
Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.554110 4664 scope.go:117] "RemoveContainer" containerID="e9acd6a3e935cf29a65fe995181e25088e73a6149a4a689a65ed6d8709c8d13f"
Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.676491 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-rb6rm"]
Oct 03 08:09:03 crc kubenswrapper[4664]: E1003 08:09:03.676998 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d75d0f35-4cda-4925-8d0d-1666f794ce9b" containerName="glance-db-sync"
Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.677013 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="d75d0f35-4cda-4925-8d0d-1666f794ce9b" containerName="glance-db-sync"
Oct 03 08:09:03 crc kubenswrapper[4664]: E1003 08:09:03.677034 4664
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455d48b8-17a8-4c2f-9923-a30626506388" containerName="init" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.677042 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="455d48b8-17a8-4c2f-9923-a30626506388" containerName="init" Oct 03 08:09:03 crc kubenswrapper[4664]: E1003 08:09:03.677063 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455d48b8-17a8-4c2f-9923-a30626506388" containerName="dnsmasq-dns" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.677070 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="455d48b8-17a8-4c2f-9923-a30626506388" containerName="dnsmasq-dns" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.677257 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="d75d0f35-4cda-4925-8d0d-1666f794ce9b" containerName="glance-db-sync" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.677287 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="455d48b8-17a8-4c2f-9923-a30626506388" containerName="dnsmasq-dns" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.678267 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.689370 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-rb6rm"] Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.691160 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.691240 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-config\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.691321 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.691398 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.693593 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kkmd\" (UniqueName: \"kubernetes.io/projected/85f0c94a-31da-41e5-aa27-9840d3704a67-kube-api-access-8kkmd\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 
08:09:03.693884 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.801770 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-config\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.803293 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-config\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.804123 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.806805 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.811638 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.811690 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kkmd\" (UniqueName: \"kubernetes.io/projected/85f0c94a-31da-41e5-aa27-9840d3704a67-kube-api-access-8kkmd\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.811865 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.812001 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.812573 4664 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.813033 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.813399 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.903229 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kkmd\" (UniqueName: \"kubernetes.io/projected/85f0c94a-31da-41e5-aa27-9840d3704a67-kube-api-access-8kkmd\") pod \"dnsmasq-dns-57c957c4ff-rb6rm\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.910300 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23020330-9f07-4330-9861-ba58dd62e6b2" path="/var/lib/kubelet/pods/23020330-9f07-4330-9861-ba58dd62e6b2/volumes" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.910888 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43c23d38-4d5e-4b2e-a062-8384dd8d8138" path="/var/lib/kubelet/pods/43c23d38-4d5e-4b2e-a062-8384dd8d8138/volumes" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.911334 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="455d48b8-17a8-4c2f-9923-a30626506388" path="/var/lib/kubelet/pods/455d48b8-17a8-4c2f-9923-a30626506388/volumes" Oct 03 08:09:03 crc kubenswrapper[4664]: I1003 08:09:03.912513 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e07f6c1a-40d5-409f-bb52-c32cbcdca08f" path="/var/lib/kubelet/pods/e07f6c1a-40d5-409f-bb52-c32cbcdca08f/volumes" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.073369 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.175038 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q2khg"] Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.240676 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7050eaa4-061a-4dd3-b4da-73e2abd04458","Type":"ContainerStarted","Data":"2530ee8677ae62c71817ffe80d3cc18b30d54b1718a2b3ffc74a70b706a727d2"} Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.243372 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-k59vh" event={"ID":"c13de59d-0879-41ca-95cc-e8bf05c223eb","Type":"ContainerStarted","Data":"2b6d39ba1d42eb78f1004d2232ad96134bd6c37a6bd1e60cd9b5ef41659c264c"} Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.244897 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76644f9584-br5jb" event={"ID":"30ce3373-ef30-4727-b57f-5be7963d1892","Type":"ContainerStarted","Data":"95b8c3354ba1366154c5515dcc0c60816b0b8b99bd0cfe357cac70a70d8a6930"} Oct 03 08:09:04 crc kubenswrapper[4664]: E1003 08:09:04.264941 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-lgr4h" podUID="4d316d5e-f411-4940-af4d-9c42f5baae63" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.285478 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85fcf9fb6-r8r76"] Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.285509 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-k59vh" podStartSLOduration=5.39799579 podStartE2EDuration="40.285485548s" podCreationTimestamp="2025-10-03 08:08:24 +0000 UTC" firstStartedPulling="2025-10-03 08:08:28.828903473 +0000 UTC m=+1209.650093963" lastFinishedPulling="2025-10-03 08:09:03.716393231 +0000 UTC m=+1244.537583721" observedRunningTime="2025-10-03 08:09:04.273100704 +0000 UTC m=+1245.094291204" watchObservedRunningTime="2025-10-03 08:09:04.285485548 +0000 UTC m=+1245.106676048" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.555034 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.556903 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.562889 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fcz7z" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.564787 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.564942 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.615223 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.647799 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-rb6rm"] Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.737740 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmzg7\" (UniqueName: \"kubernetes.io/projected/069410b4-da38-481c-8cf1-ae161edcabf1-kube-api-access-tmzg7\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.737799 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-scripts\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.737823 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-config-data\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.737880 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.737926 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/069410b4-da38-481c-8cf1-ae161edcabf1-logs\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.737989 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/069410b4-da38-481c-8cf1-ae161edcabf1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.738077 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.840031 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.840095 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/069410b4-da38-481c-8cf1-ae161edcabf1-logs\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.840181 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/069410b4-da38-481c-8cf1-ae161edcabf1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.840220 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.840330 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmzg7\" (UniqueName: \"kubernetes.io/projected/069410b4-da38-481c-8cf1-ae161edcabf1-kube-api-access-tmzg7\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.840365 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-scripts\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.840399 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-config-data\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.840900 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.840941 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/069410b4-da38-481c-8cf1-ae161edcabf1-logs\") pod \"glance-default-external-api-0\" (UID: 
\"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.841545 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/069410b4-da38-481c-8cf1-ae161edcabf1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.844874 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-config-data\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.844998 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.854347 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-scripts\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.864442 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmzg7\" (UniqueName: \"kubernetes.io/projected/069410b4-da38-481c-8cf1-ae161edcabf1-kube-api-access-tmzg7\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.939347 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.944078 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.950243 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.960484 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " pod="openstack/glance-default-external-api-0" Oct 03 08:09:04 crc kubenswrapper[4664]: I1003 08:09:04.982228 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.050812 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.050877 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13bd5acf-4425-4fcc-8c22-cbddfe167f46-logs\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.050901 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kxnr\" (UniqueName: \"kubernetes.io/projected/13bd5acf-4425-4fcc-8c22-cbddfe167f46-kube-api-access-5kxnr\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.050944 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-config-data\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.050994 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-scripts\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.051012 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13bd5acf-4425-4fcc-8c22-cbddfe167f46-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.051050 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " 
pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.156942 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.157074 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.157122 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13bd5acf-4425-4fcc-8c22-cbddfe167f46-logs\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.157144 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kxnr\" (UniqueName: \"kubernetes.io/projected/13bd5acf-4425-4fcc-8c22-cbddfe167f46-kube-api-access-5kxnr\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.157188 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-config-data\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.157231 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-scripts\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.157250 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13bd5acf-4425-4fcc-8c22-cbddfe167f46-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.157632 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13bd5acf-4425-4fcc-8c22-cbddfe167f46-logs\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.157674 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.158143 
4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13bd5acf-4425-4fcc-8c22-cbddfe167f46-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.169357 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.170076 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-scripts\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.170893 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-config-data\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.182119 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kxnr\" (UniqueName: \"kubernetes.io/projected/13bd5acf-4425-4fcc-8c22-cbddfe167f46-kube-api-access-5kxnr\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.196966 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.258748 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.272741 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q2khg" event={"ID":"6eef4071-9678-4d8e-a595-3ab2d97f1862","Type":"ContainerStarted","Data":"be6533d7289d50bcfc9390a098140db08ef97032f4b07f20dbd48d768af4243a"} Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.272809 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q2khg" event={"ID":"6eef4071-9678-4d8e-a595-3ab2d97f1862","Type":"ContainerStarted","Data":"a459349d0b81a8d423bb2deda0d2087897e3a3e55a911c223a69dc44f16aeb07"} Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.275870 4664 generic.go:334] "Generic (PLEG): container finished" podID="85f0c94a-31da-41e5-aa27-9840d3704a67" containerID="829e099c46902780ec295143722995fd114225c55c34277c05bb89f90cfb29b2" exitCode=0 Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.275991 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" event={"ID":"85f0c94a-31da-41e5-aa27-9840d3704a67","Type":"ContainerDied","Data":"829e099c46902780ec295143722995fd114225c55c34277c05bb89f90cfb29b2"} Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.276028 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" event={"ID":"85f0c94a-31da-41e5-aa27-9840d3704a67","Type":"ContainerStarted","Data":"3fd0cd375b74b811bac2a3598c913a8298257e4c911faf9270f338e8c6b03064"} Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.283184 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.305941 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-q2khg" podStartSLOduration=15.305925181 podStartE2EDuration="15.305925181s" podCreationTimestamp="2025-10-03 08:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:05.304726887 +0000 UTC m=+1246.125917387" watchObservedRunningTime="2025-10-03 08:09:05.305925181 +0000 UTC m=+1246.127115671" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.316423 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76644f9584-br5jb" event={"ID":"30ce3373-ef30-4727-b57f-5be7963d1892","Type":"ContainerStarted","Data":"78ce76bbf2905152072cab64efc16b2aac3c660597a68006c974ce630c008830"} Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.316478 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76644f9584-br5jb" event={"ID":"30ce3373-ef30-4727-b57f-5be7963d1892","Type":"ContainerStarted","Data":"b67e64d8a85e712979c8209e18f5938ff206128e2c3d1b5097e5fa9956960c1c"} Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.323347 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fcf9fb6-r8r76" event={"ID":"f1015cf1-8e4b-44fd-a794-27edfecdceed","Type":"ContainerStarted","Data":"e98a1b72baf4695265fc09cccba25e4d95e5c7d47b189cf037efd5a1aae19f4f"} Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.323575 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fcf9fb6-r8r76" 
event={"ID":"f1015cf1-8e4b-44fd-a794-27edfecdceed","Type":"ContainerStarted","Data":"c9e5ca8655f18717c62b7f3ab6007d2b67d86327c8c64df655b33d2fbd54bf3d"} Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.323671 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fcf9fb6-r8r76" event={"ID":"f1015cf1-8e4b-44fd-a794-27edfecdceed","Type":"ContainerStarted","Data":"ee886497d8db2cc9649ce3f4cf07bb390a4684d55df0c359c18c5f8404d62427"} Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.364477 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-76644f9584-br5jb" podStartSLOduration=32.779567403 podStartE2EDuration="33.364460571s" podCreationTimestamp="2025-10-03 08:08:32 +0000 UTC" firstStartedPulling="2025-10-03 08:09:03.466923033 +0000 UTC m=+1244.288113523" lastFinishedPulling="2025-10-03 08:09:04.051816201 +0000 UTC m=+1244.873006691" observedRunningTime="2025-10-03 08:09:05.362962798 +0000 UTC m=+1246.184153298" watchObservedRunningTime="2025-10-03 08:09:05.364460571 +0000 UTC m=+1246.185651061" Oct 03 08:09:05 crc kubenswrapper[4664]: I1003 08:09:05.394229 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-85fcf9fb6-r8r76" podStartSLOduration=32.39420903 podStartE2EDuration="32.39420903s" podCreationTimestamp="2025-10-03 08:08:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:05.384724399 +0000 UTC m=+1246.205914889" watchObservedRunningTime="2025-10-03 08:09:05.39420903 +0000 UTC m=+1246.215399520" Oct 03 08:09:06 crc kubenswrapper[4664]: I1003 08:09:06.455279 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:09:06 crc kubenswrapper[4664]: I1003 08:09:06.547845 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:09:07 crc kubenswrapper[4664]: I1003 08:09:07.189470 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:09:07 crc kubenswrapper[4664]: I1003 08:09:07.307312 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:09:07 crc kubenswrapper[4664]: I1003 08:09:07.375276 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13bd5acf-4425-4fcc-8c22-cbddfe167f46","Type":"ContainerStarted","Data":"99beec851990494ce5d4e9ded16851d5e3d44ab65b25ef8280e91dfbb39c362a"} Oct 03 08:09:07 crc kubenswrapper[4664]: I1003 08:09:07.380964 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7050eaa4-061a-4dd3-b4da-73e2abd04458","Type":"ContainerStarted","Data":"dec11e910be9792de36f8c4bb865a990edfa318b57c9d2c1b37962bde0dc3069"} Oct 03 08:09:07 crc kubenswrapper[4664]: I1003 08:09:07.383721 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"069410b4-da38-481c-8cf1-ae161edcabf1","Type":"ContainerStarted","Data":"871c0a0ed54eff23b0d4f58edadb94bce6dc46fcd1a054b205bc44d86f2f7bd5"} Oct 03 08:09:07 crc kubenswrapper[4664]: I1003 08:09:07.389219 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" 
event={"ID":"85f0c94a-31da-41e5-aa27-9840d3704a67","Type":"ContainerStarted","Data":"2d76b47954e6fd7c49e12c3f0824653dc8bb66b39e76aecd27741298dc53ab1a"} Oct 03 08:09:07 crc kubenswrapper[4664]: I1003 08:09:07.389913 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:07 crc kubenswrapper[4664]: I1003 08:09:07.418290 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" podStartSLOduration=4.418271168 podStartE2EDuration="4.418271168s" podCreationTimestamp="2025-10-03 08:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:07.416448126 +0000 UTC m=+1248.237638636" watchObservedRunningTime="2025-10-03 08:09:07.418271168 +0000 UTC m=+1248.239461658" Oct 03 08:09:08 crc kubenswrapper[4664]: I1003 08:09:08.406501 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"069410b4-da38-481c-8cf1-ae161edcabf1","Type":"ContainerStarted","Data":"7dae587b971cd80d76f293caf311ffc3cc9e412d2fb1bb4b87ad05dd7e0a11d8"} Oct 03 08:09:10 crc kubenswrapper[4664]: I1003 08:09:10.429424 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13bd5acf-4425-4fcc-8c22-cbddfe167f46","Type":"ContainerStarted","Data":"6f72c1307115b257a6d41f829ba2ac53221291414478f9d951a2c222af9c4395"} Oct 03 08:09:11 crc kubenswrapper[4664]: I1003 08:09:11.987859 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:09:11 crc kubenswrapper[4664]: I1003 08:09:11.988479 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:09:12 crc kubenswrapper[4664]: I1003 08:09:12.456500 4664 generic.go:334] "Generic (PLEG): container finished" podID="6eef4071-9678-4d8e-a595-3ab2d97f1862" containerID="be6533d7289d50bcfc9390a098140db08ef97032f4b07f20dbd48d768af4243a" exitCode=0 Oct 03 08:09:12 crc kubenswrapper[4664]: I1003 08:09:12.456586 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q2khg" event={"ID":"6eef4071-9678-4d8e-a595-3ab2d97f1862","Type":"ContainerDied","Data":"be6533d7289d50bcfc9390a098140db08ef97032f4b07f20dbd48d768af4243a"} Oct 03 08:09:12 crc kubenswrapper[4664]: I1003 08:09:12.461082 4664 generic.go:334] "Generic (PLEG): container finished" podID="c13de59d-0879-41ca-95cc-e8bf05c223eb" containerID="2b6d39ba1d42eb78f1004d2232ad96134bd6c37a6bd1e60cd9b5ef41659c264c" exitCode=0 Oct 03 08:09:12 crc kubenswrapper[4664]: I1003 08:09:12.461146 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-k59vh" event={"ID":"c13de59d-0879-41ca-95cc-e8bf05c223eb","Type":"ContainerDied","Data":"2b6d39ba1d42eb78f1004d2232ad96134bd6c37a6bd1e60cd9b5ef41659c264c"} Oct 03 08:09:12 crc kubenswrapper[4664]: I1003 08:09:12.464444 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"069410b4-da38-481c-8cf1-ae161edcabf1","Type":"ContainerStarted","Data":"dff62bbfbcd8f1313ca0ad23c6d3d3298ec2af22723e7af8216eb74c56233bce"} Oct 03 08:09:12 crc kubenswrapper[4664]: I1003 08:09:12.464628 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="069410b4-da38-481c-8cf1-ae161edcabf1" containerName="glance-log" containerID="cri-o://7dae587b971cd80d76f293caf311ffc3cc9e412d2fb1bb4b87ad05dd7e0a11d8" gracePeriod=30 Oct 03 08:09:12 crc kubenswrapper[4664]: I1003 08:09:12.464760 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="069410b4-da38-481c-8cf1-ae161edcabf1" containerName="glance-httpd" containerID="cri-o://dff62bbfbcd8f1313ca0ad23c6d3d3298ec2af22723e7af8216eb74c56233bce" gracePeriod=30 Oct 03 08:09:12 crc kubenswrapper[4664]: I1003 08:09:12.523042 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.5230241 podStartE2EDuration="9.5230241s" podCreationTimestamp="2025-10-03 08:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:12.514062024 +0000 UTC m=+1253.335252534" watchObservedRunningTime="2025-10-03 08:09:12.5230241 +0000 UTC m=+1253.344214590" Oct 03 08:09:13 crc kubenswrapper[4664]: I1003 08:09:13.481025 4664 generic.go:334] "Generic (PLEG): container finished" podID="069410b4-da38-481c-8cf1-ae161edcabf1" containerID="dff62bbfbcd8f1313ca0ad23c6d3d3298ec2af22723e7af8216eb74c56233bce" exitCode=0 Oct 03 08:09:13 crc kubenswrapper[4664]: I1003 08:09:13.481461 4664 generic.go:334] "Generic (PLEG): container finished" podID="069410b4-da38-481c-8cf1-ae161edcabf1" containerID="7dae587b971cd80d76f293caf311ffc3cc9e412d2fb1bb4b87ad05dd7e0a11d8" exitCode=143 Oct 03 08:09:13 crc kubenswrapper[4664]: I1003 08:09:13.481165 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"069410b4-da38-481c-8cf1-ae161edcabf1","Type":"ContainerDied","Data":"dff62bbfbcd8f1313ca0ad23c6d3d3298ec2af22723e7af8216eb74c56233bce"} Oct 03 08:09:13 crc kubenswrapper[4664]: I1003 08:09:13.481831 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"069410b4-da38-481c-8cf1-ae161edcabf1","Type":"ContainerDied","Data":"7dae587b971cd80d76f293caf311ffc3cc9e412d2fb1bb4b87ad05dd7e0a11d8"} Oct 03 08:09:13 crc kubenswrapper[4664]: I1003 08:09:13.517050 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:09:13 crc kubenswrapper[4664]: I1003 08:09:13.518339 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:09:13 crc kubenswrapper[4664]: I1003 08:09:13.582589 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:09:13 crc kubenswrapper[4664]: I1003 08:09:13.582725 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:09:14 crc kubenswrapper[4664]: I1003 08:09:14.085565 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:14 crc kubenswrapper[4664]: 
I1003 08:09:14.157909 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-666mt"] Oct 03 08:09:14 crc kubenswrapper[4664]: I1003 08:09:14.158178 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" podUID="9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" containerName="dnsmasq-dns" containerID="cri-o://64d835e96ba4280eb7e483668dddb5bde20f1a1b8703908dd6b6d2c6bd5af680" gracePeriod=10 Oct 03 08:09:14 crc kubenswrapper[4664]: I1003 08:09:14.492363 4664 generic.go:334] "Generic (PLEG): container finished" podID="9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" containerID="64d835e96ba4280eb7e483668dddb5bde20f1a1b8703908dd6b6d2c6bd5af680" exitCode=0 Oct 03 08:09:14 crc kubenswrapper[4664]: I1003 08:09:14.492441 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" event={"ID":"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd","Type":"ContainerDied","Data":"64d835e96ba4280eb7e483668dddb5bde20f1a1b8703908dd6b6d2c6bd5af680"} Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.169149 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" podUID="9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: connect: connection refused" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.294531 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-k59vh" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.299356 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.389488 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-combined-ca-bundle\") pod \"c13de59d-0879-41ca-95cc-e8bf05c223eb\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.389915 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-combined-ca-bundle\") pod \"6eef4071-9678-4d8e-a595-3ab2d97f1862\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.389986 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c13de59d-0879-41ca-95cc-e8bf05c223eb-logs\") pod \"c13de59d-0879-41ca-95cc-e8bf05c223eb\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.390067 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-scripts\") pod \"c13de59d-0879-41ca-95cc-e8bf05c223eb\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.390160 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-config-data\") pod \"6eef4071-9678-4d8e-a595-3ab2d97f1862\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.390185 4664 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-config-data\") pod \"c13de59d-0879-41ca-95cc-e8bf05c223eb\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.390211 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-credential-keys\") pod \"6eef4071-9678-4d8e-a595-3ab2d97f1862\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.390267 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-fernet-keys\") pod \"6eef4071-9678-4d8e-a595-3ab2d97f1862\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.390301 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr44v\" (UniqueName: \"kubernetes.io/projected/c13de59d-0879-41ca-95cc-e8bf05c223eb-kube-api-access-gr44v\") pod \"c13de59d-0879-41ca-95cc-e8bf05c223eb\" (UID: \"c13de59d-0879-41ca-95cc-e8bf05c223eb\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.390375 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjw2d\" (UniqueName: \"kubernetes.io/projected/6eef4071-9678-4d8e-a595-3ab2d97f1862-kube-api-access-fjw2d\") pod \"6eef4071-9678-4d8e-a595-3ab2d97f1862\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.390443 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-scripts\") pod \"6eef4071-9678-4d8e-a595-3ab2d97f1862\" (UID: \"6eef4071-9678-4d8e-a595-3ab2d97f1862\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.394438 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c13de59d-0879-41ca-95cc-e8bf05c223eb-logs" (OuterVolumeSpecName: "logs") pod "c13de59d-0879-41ca-95cc-e8bf05c223eb" (UID: "c13de59d-0879-41ca-95cc-e8bf05c223eb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.396138 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-scripts" (OuterVolumeSpecName: "scripts") pod "6eef4071-9678-4d8e-a595-3ab2d97f1862" (UID: "6eef4071-9678-4d8e-a595-3ab2d97f1862"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.396465 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eef4071-9678-4d8e-a595-3ab2d97f1862-kube-api-access-fjw2d" (OuterVolumeSpecName: "kube-api-access-fjw2d") pod "6eef4071-9678-4d8e-a595-3ab2d97f1862" (UID: "6eef4071-9678-4d8e-a595-3ab2d97f1862"). InnerVolumeSpecName "kube-api-access-fjw2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.399282 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-scripts" (OuterVolumeSpecName: "scripts") pod "c13de59d-0879-41ca-95cc-e8bf05c223eb" (UID: "c13de59d-0879-41ca-95cc-e8bf05c223eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.399840 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13de59d-0879-41ca-95cc-e8bf05c223eb-kube-api-access-gr44v" (OuterVolumeSpecName: "kube-api-access-gr44v") pod "c13de59d-0879-41ca-95cc-e8bf05c223eb" (UID: "c13de59d-0879-41ca-95cc-e8bf05c223eb"). InnerVolumeSpecName "kube-api-access-gr44v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.402937 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6eef4071-9678-4d8e-a595-3ab2d97f1862" (UID: "6eef4071-9678-4d8e-a595-3ab2d97f1862"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.424970 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6eef4071-9678-4d8e-a595-3ab2d97f1862" (UID: "6eef4071-9678-4d8e-a595-3ab2d97f1862"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.430430 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-config-data" (OuterVolumeSpecName: "config-data") pod "6eef4071-9678-4d8e-a595-3ab2d97f1862" (UID: "6eef4071-9678-4d8e-a595-3ab2d97f1862"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.458293 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-config-data" (OuterVolumeSpecName: "config-data") pod "c13de59d-0879-41ca-95cc-e8bf05c223eb" (UID: "c13de59d-0879-41ca-95cc-e8bf05c223eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.466040 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c13de59d-0879-41ca-95cc-e8bf05c223eb" (UID: "c13de59d-0879-41ca-95cc-e8bf05c223eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.480322 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6eef4071-9678-4d8e-a595-3ab2d97f1862" (UID: "6eef4071-9678-4d8e-a595-3ab2d97f1862"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.492766 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.492840 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.492853 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c13de59d-0879-41ca-95cc-e8bf05c223eb-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.492864 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.492874 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13de59d-0879-41ca-95cc-e8bf05c223eb-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.492884 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.492911 4664 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.492920 4664 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.492933 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr44v\" (UniqueName: \"kubernetes.io/projected/c13de59d-0879-41ca-95cc-e8bf05c223eb-kube-api-access-gr44v\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.492946 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjw2d\" (UniqueName: \"kubernetes.io/projected/6eef4071-9678-4d8e-a595-3ab2d97f1862-kube-api-access-fjw2d\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.492957 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eef4071-9678-4d8e-a595-3ab2d97f1862-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.515587 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" event={"ID":"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd","Type":"ContainerDied","Data":"8a86d3f624f5fa81c852cc50e11736566da68a1f4c7c14b914ee4bfb32a649de"} Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.515649 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a86d3f624f5fa81c852cc50e11736566da68a1f4c7c14b914ee4bfb32a649de" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.540846 4664 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q2khg" event={"ID":"6eef4071-9678-4d8e-a595-3ab2d97f1862","Type":"ContainerDied","Data":"a459349d0b81a8d423bb2deda0d2087897e3a3e55a911c223a69dc44f16aeb07"} Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.540887 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a459349d0b81a8d423bb2deda0d2087897e3a3e55a911c223a69dc44f16aeb07" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.540961 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q2khg" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.564470 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-k59vh" event={"ID":"c13de59d-0879-41ca-95cc-e8bf05c223eb","Type":"ContainerDied","Data":"19851dc8d6e21a2ae3d5f3d4df81b600f75122f7696c8c39b15d5b044c20f252"} Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.564539 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19851dc8d6e21a2ae3d5f3d4df81b600f75122f7696c8c39b15d5b044c20f252" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.564726 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-k59vh" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.617092 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.698900 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-dns-swift-storage-0\") pod \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.698957 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-dns-svc\") pod \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.699016 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckgbt\" (UniqueName: \"kubernetes.io/projected/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-kube-api-access-ckgbt\") pod \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.699106 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-config\") pod \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.699131 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-ovsdbserver-nb\") pod \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.699821 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-ovsdbserver-sb\") pod \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\" (UID: \"9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.701545 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.710767 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-kube-api-access-ckgbt" (OuterVolumeSpecName: "kube-api-access-ckgbt") pod "9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" (UID: "9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd"). InnerVolumeSpecName "kube-api-access-ckgbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.770582 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" (UID: "9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.782522 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" (UID: "9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.782965 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-config" (OuterVolumeSpecName: "config") pod "9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" (UID: "9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.795511 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" (UID: "9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.800791 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"069410b4-da38-481c-8cf1-ae161edcabf1\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.800848 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/069410b4-da38-481c-8cf1-ae161edcabf1-httpd-run\") pod \"069410b4-da38-481c-8cf1-ae161edcabf1\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.800907 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/069410b4-da38-481c-8cf1-ae161edcabf1-logs\") pod \"069410b4-da38-481c-8cf1-ae161edcabf1\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.801110 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-combined-ca-bundle\") pod \"069410b4-da38-481c-8cf1-ae161edcabf1\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.801142 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmzg7\" (UniqueName: \"kubernetes.io/projected/069410b4-da38-481c-8cf1-ae161edcabf1-kube-api-access-tmzg7\") pod \"069410b4-da38-481c-8cf1-ae161edcabf1\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.801166 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-scripts\") pod \"069410b4-da38-481c-8cf1-ae161edcabf1\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.801232 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-config-data\") pod \"069410b4-da38-481c-8cf1-ae161edcabf1\" (UID: \"069410b4-da38-481c-8cf1-ae161edcabf1\") " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.801687 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.801710 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckgbt\" (UniqueName: \"kubernetes.io/projected/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-kube-api-access-ckgbt\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.801724 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.801740 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc 
kubenswrapper[4664]: I1003 08:09:15.801753 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.803418 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/069410b4-da38-481c-8cf1-ae161edcabf1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "069410b4-da38-481c-8cf1-ae161edcabf1" (UID: "069410b4-da38-481c-8cf1-ae161edcabf1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.803711 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/069410b4-da38-481c-8cf1-ae161edcabf1-logs" (OuterVolumeSpecName: "logs") pod "069410b4-da38-481c-8cf1-ae161edcabf1" (UID: "069410b4-da38-481c-8cf1-ae161edcabf1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.807292 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "069410b4-da38-481c-8cf1-ae161edcabf1" (UID: "069410b4-da38-481c-8cf1-ae161edcabf1"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.809934 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-scripts" (OuterVolumeSpecName: "scripts") pod "069410b4-da38-481c-8cf1-ae161edcabf1" (UID: "069410b4-da38-481c-8cf1-ae161edcabf1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.809982 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/069410b4-da38-481c-8cf1-ae161edcabf1-kube-api-access-tmzg7" (OuterVolumeSpecName: "kube-api-access-tmzg7") pod "069410b4-da38-481c-8cf1-ae161edcabf1" (UID: "069410b4-da38-481c-8cf1-ae161edcabf1"). InnerVolumeSpecName "kube-api-access-tmzg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.814359 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" (UID: "9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.846999 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "069410b4-da38-481c-8cf1-ae161edcabf1" (UID: "069410b4-da38-481c-8cf1-ae161edcabf1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.876898 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-config-data" (OuterVolumeSpecName: "config-data") pod "069410b4-da38-481c-8cf1-ae161edcabf1" (UID: "069410b4-da38-481c-8cf1-ae161edcabf1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.904212 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmzg7\" (UniqueName: \"kubernetes.io/projected/069410b4-da38-481c-8cf1-ae161edcabf1-kube-api-access-tmzg7\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.904359 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.904464 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.904537 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069410b4-da38-481c-8cf1-ae161edcabf1-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.904589 4664 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.904675 4664 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.904730 4664 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/069410b4-da38-481c-8cf1-ae161edcabf1-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.904781 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/069410b4-da38-481c-8cf1-ae161edcabf1-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:15 crc kubenswrapper[4664]: I1003 08:09:15.933669 4664 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.006896 4664 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.462619 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-67cd87976d-7fbgw"] Oct 03 08:09:16 crc kubenswrapper[4664]: E1003 08:09:16.463779 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13de59d-0879-41ca-95cc-e8bf05c223eb" containerName="placement-db-sync" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.463801 4664 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c13de59d-0879-41ca-95cc-e8bf05c223eb" containerName="placement-db-sync" Oct 03 08:09:16 crc kubenswrapper[4664]: E1003 08:09:16.463820 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eef4071-9678-4d8e-a595-3ab2d97f1862" containerName="keystone-bootstrap" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.463833 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eef4071-9678-4d8e-a595-3ab2d97f1862" containerName="keystone-bootstrap" Oct 03 08:09:16 crc kubenswrapper[4664]: E1003 08:09:16.463871 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069410b4-da38-481c-8cf1-ae161edcabf1" containerName="glance-httpd" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.463879 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="069410b4-da38-481c-8cf1-ae161edcabf1" containerName="glance-httpd" Oct 03 08:09:16 crc kubenswrapper[4664]: E1003 08:09:16.463901 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" containerName="init" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.463909 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" containerName="init" Oct 03 08:09:16 crc kubenswrapper[4664]: E1003 08:09:16.463927 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069410b4-da38-481c-8cf1-ae161edcabf1" containerName="glance-log" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.463935 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="069410b4-da38-481c-8cf1-ae161edcabf1" containerName="glance-log" Oct 03 08:09:16 crc kubenswrapper[4664]: E1003 08:09:16.463948 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" containerName="dnsmasq-dns" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.463955 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" containerName="dnsmasq-dns" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.464475 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13de59d-0879-41ca-95cc-e8bf05c223eb" containerName="placement-db-sync" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.464507 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eef4071-9678-4d8e-a595-3ab2d97f1862" containerName="keystone-bootstrap" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.464525 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="069410b4-da38-481c-8cf1-ae161edcabf1" containerName="glance-httpd" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.464541 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" containerName="dnsmasq-dns" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.464562 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="069410b4-da38-481c-8cf1-ae161edcabf1" containerName="glance-log" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.466488 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.475925 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8qjww" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.476234 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.476429 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.476677 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.476879 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.505447 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67cd87976d-7fbgw"] Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.572328 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ccfbd46bc-qz9qm"] Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.573730 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.577592 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bm9nt" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.577889 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.578055 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.581681 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.581864 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.582018 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.591311 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ccfbd46bc-qz9qm"] Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.598527 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.599353 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"069410b4-da38-481c-8cf1-ae161edcabf1","Type":"ContainerDied","Data":"871c0a0ed54eff23b0d4f58edadb94bce6dc46fcd1a054b205bc44d86f2f7bd5"} Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.599395 4664 scope.go:117] "RemoveContainer" containerID="dff62bbfbcd8f1313ca0ad23c6d3d3298ec2af22723e7af8216eb74c56233bce" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.609760 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wcvcw" event={"ID":"82832c17-408b-4b89-992f-09e393024fe2","Type":"ContainerStarted","Data":"6c5a48133e9038524a116cd42bf27c44fb9734cc10a9f3b5e3e6f8820e6c5446"} Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.617032 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13bd5acf-4425-4fcc-8c22-cbddfe167f46","Type":"ContainerStarted","Data":"3e1bf8930096612aa2991b40c4361f49d124f8a9baa403cf5f0f42cd3ce83bb7"} Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.617210 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="13bd5acf-4425-4fcc-8c22-cbddfe167f46" containerName="glance-log" containerID="cri-o://6f72c1307115b257a6d41f829ba2ac53221291414478f9d951a2c222af9c4395" gracePeriod=30 Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.617486 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="13bd5acf-4425-4fcc-8c22-cbddfe167f46" containerName="glance-httpd" containerID="cri-o://3e1bf8930096612aa2991b40c4361f49d124f8a9baa403cf5f0f42cd3ce83bb7" gracePeriod=30 Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.622485 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a226bb56-10cc-42f6-81dc-bf62f7f4038d-scripts\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.622548 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-fernet-keys\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.622595 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a226bb56-10cc-42f6-81dc-bf62f7f4038d-internal-tls-certs\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.622646 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a226bb56-10cc-42f6-81dc-bf62f7f4038d-public-tls-certs\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 
08:09:16.622672 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv6d6\" (UniqueName: \"kubernetes.io/projected/05056d09-f95e-40cb-96e7-100243e5a858-kube-api-access-pv6d6\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.622712 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-combined-ca-bundle\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.622752 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-credential-keys\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.622810 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmmgc\" (UniqueName: \"kubernetes.io/projected/a226bb56-10cc-42f6-81dc-bf62f7f4038d-kube-api-access-zmmgc\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.622835 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-public-tls-certs\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.622889 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-scripts\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.622927 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a226bb56-10cc-42f6-81dc-bf62f7f4038d-config-data\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.622965 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-internal-tls-certs\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.622993 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a226bb56-10cc-42f6-81dc-bf62f7f4038d-combined-ca-bundle\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " 
pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.623027 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a226bb56-10cc-42f6-81dc-bf62f7f4038d-logs\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.623060 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-config-data\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.640618 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.643783 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-666mt" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.645010 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7050eaa4-061a-4dd3-b4da-73e2abd04458","Type":"ContainerStarted","Data":"287d1f9ca3e681244d3cce039edfcd409cea2b90f4debc427dfadeab6c59b4e2"} Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.670805 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.673212 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wcvcw" podStartSLOduration=5.508237174 podStartE2EDuration="52.673196418s" podCreationTimestamp="2025-10-03 08:08:24 +0000 UTC" firstStartedPulling="2025-10-03 08:08:28.179305639 +0000 UTC m=+1209.000496119" lastFinishedPulling="2025-10-03 08:09:15.344264873 +0000 UTC m=+1256.165455363" observedRunningTime="2025-10-03 08:09:16.640165175 +0000 UTC m=+1257.461355675" watchObservedRunningTime="2025-10-03 08:09:16.673196418 +0000 UTC m=+1257.494386908" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.681008 4664 scope.go:117] "RemoveContainer" containerID="7dae587b971cd80d76f293caf311ffc3cc9e412d2fb1bb4b87ad05dd7e0a11d8" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.703849 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.706890 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.711253 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.711352 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.719311 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=13.719288273 podStartE2EDuration="13.719288273s" podCreationTimestamp="2025-10-03 08:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:16.679007423 +0000 UTC m=+1257.500197923" watchObservedRunningTime="2025-10-03 08:09:16.719288273 +0000 UTC m=+1257.540478783" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.729259 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a226bb56-10cc-42f6-81dc-bf62f7f4038d-internal-tls-certs\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.729340 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a226bb56-10cc-42f6-81dc-bf62f7f4038d-public-tls-certs\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.729380 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv6d6\" (UniqueName: \"kubernetes.io/projected/05056d09-f95e-40cb-96e7-100243e5a858-kube-api-access-pv6d6\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.729420 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-combined-ca-bundle\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.729493 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-credential-keys\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.729570 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmmgc\" (UniqueName: \"kubernetes.io/projected/a226bb56-10cc-42f6-81dc-bf62f7f4038d-kube-api-access-zmmgc\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.729603 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-public-tls-certs\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.729800 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-scripts\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.729850 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a226bb56-10cc-42f6-81dc-bf62f7f4038d-config-data\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.729943 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-internal-tls-certs\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.729981 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a226bb56-10cc-42f6-81dc-bf62f7f4038d-combined-ca-bundle\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.730037 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a226bb56-10cc-42f6-81dc-bf62f7f4038d-logs\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.730090 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-config-data\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.730130 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a226bb56-10cc-42f6-81dc-bf62f7f4038d-scripts\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.730157 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-fernet-keys\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.733056 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a226bb56-10cc-42f6-81dc-bf62f7f4038d-logs\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw" 
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.737287 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-fernet-keys\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.740358 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-internal-tls-certs\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.742073 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a226bb56-10cc-42f6-81dc-bf62f7f4038d-combined-ca-bundle\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.744167 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a226bb56-10cc-42f6-81dc-bf62f7f4038d-scripts\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.744458 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-combined-ca-bundle\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.745073 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a226bb56-10cc-42f6-81dc-bf62f7f4038d-public-tls-certs\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.747505 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-credential-keys\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.756242 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-scripts\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.756330 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-config-data\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.756336 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmmgc\" (UniqueName: \"kubernetes.io/projected/a226bb56-10cc-42f6-81dc-bf62f7f4038d-kube-api-access-zmmgc\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.758596 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a226bb56-10cc-42f6-81dc-bf62f7f4038d-config-data\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.760269 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.761183 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a226bb56-10cc-42f6-81dc-bf62f7f4038d-internal-tls-certs\") pod \"placement-67cd87976d-7fbgw\" (UID: \"a226bb56-10cc-42f6-81dc-bf62f7f4038d\") " pod="openstack/placement-67cd87976d-7fbgw"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.763065 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05056d09-f95e-40cb-96e7-100243e5a858-public-tls-certs\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.767390 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv6d6\" (UniqueName: \"kubernetes.io/projected/05056d09-f95e-40cb-96e7-100243e5a858-kube-api-access-pv6d6\") pod \"keystone-ccfbd46bc-qz9qm\" (UID: \"05056d09-f95e-40cb-96e7-100243e5a858\") " pod="openstack/keystone-ccfbd46bc-qz9qm"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.767575 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-666mt"]
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.774624 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-666mt"]
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.814734 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-67cd87976d-7fbgw"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.842128 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-scripts\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.842394 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.843210 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6837bc3c-3e78-491c-8f0c-377955560009-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.843267 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p494m\" (UniqueName: \"kubernetes.io/projected/6837bc3c-3e78-491c-8f0c-377955560009-kube-api-access-p494m\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.843315 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.844251 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.844428 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6837bc3c-3e78-491c-8f0c-377955560009-logs\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.844479 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-config-data\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.905089 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ccfbd46bc-qz9qm"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.946898 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-config-data\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.947015 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-scripts\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.947071 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.947143 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6837bc3c-3e78-491c-8f0c-377955560009-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.947177 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p494m\" (UniqueName: \"kubernetes.io/projected/6837bc3c-3e78-491c-8f0c-377955560009-kube-api-access-p494m\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.947214 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.947272 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.947386 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6837bc3c-3e78-491c-8f0c-377955560009-logs\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.948135 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6837bc3c-3e78-491c-8f0c-377955560009-logs\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.948296 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6837bc3c-3e78-491c-8f0c-377955560009-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.949337 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.954214 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-config-data\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.955582 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.955668 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.956540 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-scripts\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:16 crc kubenswrapper[4664]: I1003 08:09:16.977888 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p494m\" (UniqueName: \"kubernetes.io/projected/6837bc3c-3e78-491c-8f0c-377955560009-kube-api-access-p494m\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.023972 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " pod="openstack/glance-default-external-api-0"
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.209182 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 03 08:09:17 crc kubenswrapper[4664]: E1003 08:09:17.347089 4664 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13bd5acf_4425_4fcc_8c22_cbddfe167f46.slice/crio-conmon-3e1bf8930096612aa2991b40c4361f49d124f8a9baa403cf5f0f42cd3ce83bb7.scope\": RecentStats: unable to find data in memory cache]"
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.450801 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67cd87976d-7fbgw"]
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.602638 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ccfbd46bc-qz9qm"]
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.689947 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67cd87976d-7fbgw" event={"ID":"a226bb56-10cc-42f6-81dc-bf62f7f4038d","Type":"ContainerStarted","Data":"1a3a4daaef79ada48e831e683a87925df0a6222ef60f3d15ca9f14f780705d04"}
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.705389 4664 generic.go:334] "Generic (PLEG): container finished" podID="13bd5acf-4425-4fcc-8c22-cbddfe167f46" containerID="3e1bf8930096612aa2991b40c4361f49d124f8a9baa403cf5f0f42cd3ce83bb7" exitCode=0
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.705443 4664 generic.go:334] "Generic (PLEG): container finished" podID="13bd5acf-4425-4fcc-8c22-cbddfe167f46" containerID="6f72c1307115b257a6d41f829ba2ac53221291414478f9d951a2c222af9c4395" exitCode=143
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.705463 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13bd5acf-4425-4fcc-8c22-cbddfe167f46","Type":"ContainerDied","Data":"3e1bf8930096612aa2991b40c4361f49d124f8a9baa403cf5f0f42cd3ce83bb7"}
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.705496 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13bd5acf-4425-4fcc-8c22-cbddfe167f46","Type":"ContainerDied","Data":"6f72c1307115b257a6d41f829ba2ac53221291414478f9d951a2c222af9c4395"}
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.708357 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.779038 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-combined-ca-bundle\") pod \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") "
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.779188 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13bd5acf-4425-4fcc-8c22-cbddfe167f46-logs\") pod \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") "
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.779229 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-config-data\") pod \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") "
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.779308 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-scripts\") pod \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") "
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.779353 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13bd5acf-4425-4fcc-8c22-cbddfe167f46-httpd-run\") pod \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") "
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.779379 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kxnr\" (UniqueName: \"kubernetes.io/projected/13bd5acf-4425-4fcc-8c22-cbddfe167f46-kube-api-access-5kxnr\") pod \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") "
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.779470 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\" (UID: \"13bd5acf-4425-4fcc-8c22-cbddfe167f46\") "
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.779700 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13bd5acf-4425-4fcc-8c22-cbddfe167f46-logs" (OuterVolumeSpecName: "logs") pod "13bd5acf-4425-4fcc-8c22-cbddfe167f46" (UID: "13bd5acf-4425-4fcc-8c22-cbddfe167f46"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.779937 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13bd5acf-4425-4fcc-8c22-cbddfe167f46-logs\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.780123 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13bd5acf-4425-4fcc-8c22-cbddfe167f46-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "13bd5acf-4425-4fcc-8c22-cbddfe167f46" (UID: "13bd5acf-4425-4fcc-8c22-cbddfe167f46"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.787368 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13bd5acf-4425-4fcc-8c22-cbddfe167f46-kube-api-access-5kxnr" (OuterVolumeSpecName: "kube-api-access-5kxnr") pod "13bd5acf-4425-4fcc-8c22-cbddfe167f46" (UID: "13bd5acf-4425-4fcc-8c22-cbddfe167f46"). InnerVolumeSpecName "kube-api-access-5kxnr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.787863 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-scripts" (OuterVolumeSpecName: "scripts") pod "13bd5acf-4425-4fcc-8c22-cbddfe167f46" (UID: "13bd5acf-4425-4fcc-8c22-cbddfe167f46"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.788849 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "13bd5acf-4425-4fcc-8c22-cbddfe167f46" (UID: "13bd5acf-4425-4fcc-8c22-cbddfe167f46"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.830152 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13bd5acf-4425-4fcc-8c22-cbddfe167f46" (UID: "13bd5acf-4425-4fcc-8c22-cbddfe167f46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.882542 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.882564 4664 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13bd5acf-4425-4fcc-8c22-cbddfe167f46-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.882573 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kxnr\" (UniqueName: \"kubernetes.io/projected/13bd5acf-4425-4fcc-8c22-cbddfe167f46-kube-api-access-5kxnr\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.882591 4664 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.882600 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.901504 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="069410b4-da38-481c-8cf1-ae161edcabf1" path="/var/lib/kubelet/pods/069410b4-da38-481c-8cf1-ae161edcabf1/volumes"
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.908053 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd" path="/var/lib/kubelet/pods/9d8fc6c1-462f-4dd9-af1b-9a53f665e0cd/volumes"
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.910904 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-config-data" (OuterVolumeSpecName: "config-data") pod "13bd5acf-4425-4fcc-8c22-cbddfe167f46" (UID: "13bd5acf-4425-4fcc-8c22-cbddfe167f46"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.916709 4664 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.964189 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.984295 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bd5acf-4425-4fcc-8c22-cbddfe167f46-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:17 crc kubenswrapper[4664]: I1003 08:09:17.984327 4664 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:17 crc kubenswrapper[4664]: W1003 08:09:17.992843 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6837bc3c_3e78_491c_8f0c_377955560009.slice/crio-e2f8ee4ceeae5c30394f605c54391c20196dd76da65ac3552edb6244a3d1ab10 WatchSource:0}: Error finding container e2f8ee4ceeae5c30394f605c54391c20196dd76da65ac3552edb6244a3d1ab10: Status 404 returned error can't find the container with id e2f8ee4ceeae5c30394f605c54391c20196dd76da65ac3552edb6244a3d1ab10
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.720187 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6837bc3c-3e78-491c-8f0c-377955560009","Type":"ContainerStarted","Data":"2de90cfa4619e4a13019591892950d05a8b7cfea2d32a02ef57d42d8c4658c76"}
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.720510 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6837bc3c-3e78-491c-8f0c-377955560009","Type":"ContainerStarted","Data":"e2f8ee4ceeae5c30394f605c54391c20196dd76da65ac3552edb6244a3d1ab10"}
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.723429 4664 generic.go:334] "Generic (PLEG): container finished" podID="bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc" containerID="b9c3c4d136277110b5b01a873e525e489aff77e0d0ef27946aaeaf217e35ca6c" exitCode=0
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.723469 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vb5f6" event={"ID":"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc","Type":"ContainerDied","Data":"b9c3c4d136277110b5b01a873e525e489aff77e0d0ef27946aaeaf217e35ca6c"}
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.729198 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13bd5acf-4425-4fcc-8c22-cbddfe167f46","Type":"ContainerDied","Data":"99beec851990494ce5d4e9ded16851d5e3d44ab65b25ef8280e91dfbb39c362a"}
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.729263 4664 scope.go:117] "RemoveContainer" containerID="3e1bf8930096612aa2991b40c4361f49d124f8a9baa403cf5f0f42cd3ce83bb7"
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.729392 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.746098 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ccfbd46bc-qz9qm" event={"ID":"05056d09-f95e-40cb-96e7-100243e5a858","Type":"ContainerStarted","Data":"a2fb6c0bce8e85ad98ddf16f75ddbb5148c6cc73155c1dffb2f477cf8cacad53"}
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.746142 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ccfbd46bc-qz9qm" event={"ID":"05056d09-f95e-40cb-96e7-100243e5a858","Type":"ContainerStarted","Data":"b97f52a660be319829ea44d1be28b4c87db16883f5a4bb7d32cfa674fc1a2c89"}
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.746194 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-ccfbd46bc-qz9qm"
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.749790 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67cd87976d-7fbgw" event={"ID":"a226bb56-10cc-42f6-81dc-bf62f7f4038d","Type":"ContainerStarted","Data":"d6d73bfdf0f6c6c4b434763dd677186ae59f0ee45e14828798a481178235db5a"}
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.749822 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67cd87976d-7fbgw" event={"ID":"a226bb56-10cc-42f6-81dc-bf62f7f4038d","Type":"ContainerStarted","Data":"6169a3323ff8b098f2d4b7aca749719f40af3a420268183a90bd69bc3b6d884b"}
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.750920 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67cd87976d-7fbgw"
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.750948 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67cd87976d-7fbgw"
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.771460 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-ccfbd46bc-qz9qm" podStartSLOduration=2.7714425929999997 podStartE2EDuration="2.771442593s" podCreationTimestamp="2025-10-03 08:09:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:18.76853877 +0000 UTC m=+1259.589729270" watchObservedRunningTime="2025-10-03 08:09:18.771442593 +0000 UTC m=+1259.592633083"
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.793729 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-67cd87976d-7fbgw" podStartSLOduration=2.7937047379999997 podStartE2EDuration="2.793704738s" podCreationTimestamp="2025-10-03 08:09:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:18.790003542 +0000 UTC m=+1259.611194042" watchObservedRunningTime="2025-10-03 08:09:18.793704738 +0000 UTC m=+1259.614895228"
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.849805 4664 scope.go:117] "RemoveContainer" containerID="6f72c1307115b257a6d41f829ba2ac53221291414478f9d951a2c222af9c4395"
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.853097 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.862676 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.895751 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 08:09:18 crc kubenswrapper[4664]: E1003 08:09:18.896295 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13bd5acf-4425-4fcc-8c22-cbddfe167f46" containerName="glance-httpd"
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.896323 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="13bd5acf-4425-4fcc-8c22-cbddfe167f46" containerName="glance-httpd"
Oct 03 08:09:18 crc kubenswrapper[4664]: E1003 08:09:18.896345 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13bd5acf-4425-4fcc-8c22-cbddfe167f46" containerName="glance-log"
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.896353 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="13bd5acf-4425-4fcc-8c22-cbddfe167f46" containerName="glance-log"
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.896602 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="13bd5acf-4425-4fcc-8c22-cbddfe167f46" containerName="glance-httpd"
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.896668 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="13bd5acf-4425-4fcc-8c22-cbddfe167f46" containerName="glance-log"
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.897978 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.903151 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.905111 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 03 08:09:18 crc kubenswrapper[4664]: I1003 08:09:18.906532 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.008520 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7b4a287-6826-4b8f-945a-aad1d1deb92a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.008896 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b4a287-6826-4b8f-945a-aad1d1deb92a-logs\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.008926 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.008972 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np7r7\" (UniqueName: \"kubernetes.io/projected/c7b4a287-6826-4b8f-945a-aad1d1deb92a-kube-api-access-np7r7\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.009024 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.009047 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.009082 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.009099 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.111280 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7b4a287-6826-4b8f-945a-aad1d1deb92a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.111884 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b4a287-6826-4b8f-945a-aad1d1deb92a-logs\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.111910 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.111986 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np7r7\" (UniqueName: \"kubernetes.io/projected/c7b4a287-6826-4b8f-945a-aad1d1deb92a-kube-api-access-np7r7\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.112068 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.112108 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.112153 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.112172 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.111811 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7b4a287-6826-4b8f-945a-aad1d1deb92a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.113153 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b4a287-6826-4b8f-945a-aad1d1deb92a-logs\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.115294 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.118850 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.119189 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.121292 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.122848 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.130418 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np7r7\" (UniqueName: \"kubernetes.io/projected/c7b4a287-6826-4b8f-945a-aad1d1deb92a-kube-api-access-np7r7\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.177971 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.231469 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.765842 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6837bc3c-3e78-491c-8f0c-377955560009","Type":"ContainerStarted","Data":"eb1303c4f6e4d5ecb19909b618518eb2db4beabcd25acd9db5c40270d97f1986"}
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.770581 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lgr4h" event={"ID":"4d316d5e-f411-4940-af4d-9c42f5baae63","Type":"ContainerStarted","Data":"18a7ff075956ded64f9d6e2abb1947765764e84363b9a35a07db64b75756a64d"}
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.797269 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.7972525790000002 podStartE2EDuration="3.797252579s" podCreationTimestamp="2025-10-03 08:09:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:19.790790905 +0000 UTC m=+1260.611981405" watchObservedRunningTime="2025-10-03 08:09:19.797252579 +0000 UTC m=+1260.618443069"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.849486 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-lgr4h" podStartSLOduration=6.41136195 podStartE2EDuration="56.849468779s" podCreationTimestamp="2025-10-03 08:08:23 +0000 UTC" firstStartedPulling="2025-10-03 08:08:28.206383952 +0000 UTC m=+1209.027574442" lastFinishedPulling="2025-10-03 08:09:18.644490781 +0000 UTC m=+1259.465681271" observedRunningTime="2025-10-03 08:09:19.843038596 +0000 UTC m=+1260.664229086" watchObservedRunningTime="2025-10-03 08:09:19.849468779 +0000 UTC m=+1260.670659269"
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.870645 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 08:09:19 crc kubenswrapper[4664]: I1003 08:09:19.912020 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13bd5acf-4425-4fcc-8c22-cbddfe167f46" path="/var/lib/kubelet/pods/13bd5acf-4425-4fcc-8c22-cbddfe167f46/volumes"
Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.100503 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vb5f6"
Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.252357 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-combined-ca-bundle\") pod \"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc\" (UID: \"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc\") "
Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.252839 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bndr2\" (UniqueName: \"kubernetes.io/projected/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-kube-api-access-bndr2\") pod \"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc\" (UID: \"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc\") "
Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.252913 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-config\") pod \"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc\" (UID: \"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc\") "
Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.260876 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-kube-api-access-bndr2" (OuterVolumeSpecName: "kube-api-access-bndr2") pod "bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc" (UID: "bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc"). InnerVolumeSpecName "kube-api-access-bndr2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.287756 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc" (UID: "bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.295931 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-config" (OuterVolumeSpecName: "config") pod "bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc" (UID: "bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.356407 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bndr2\" (UniqueName: \"kubernetes.io/projected/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-kube-api-access-bndr2\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.356456 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-config\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.356468 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.787948 4664 generic.go:334] "Generic (PLEG): container finished" podID="82832c17-408b-4b89-992f-09e393024fe2" containerID="6c5a48133e9038524a116cd42bf27c44fb9734cc10a9f3b5e3e6f8820e6c5446" exitCode=0
Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.788272 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wcvcw" event={"ID":"82832c17-408b-4b89-992f-09e393024fe2","Type":"ContainerDied","Data":"6c5a48133e9038524a116cd42bf27c44fb9734cc10a9f3b5e3e6f8820e6c5446"}
Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.807070 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vb5f6" event={"ID":"bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc","Type":"ContainerDied","Data":"97a116047312948a9ca99f68ced0c3fab929cf794a876804e83cee49c3c5a67e"}
Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.807117 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97a116047312948a9ca99f68ced0c3fab929cf794a876804e83cee49c3c5a67e"
Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.807339 4664 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/neutron-db-sync-vb5f6" Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.825053 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7b4a287-6826-4b8f-945a-aad1d1deb92a","Type":"ContainerStarted","Data":"54d3fbc4e473e2f16d19aae198c89dc0d7dee395d64fa9560554f9bb24e82498"} Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.825099 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7b4a287-6826-4b8f-945a-aad1d1deb92a","Type":"ContainerStarted","Data":"ab3021c9bd21ce0c840f0cb79809e87d17c6f0d3f5331fb566d3f13df065f5c7"} Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.904723 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-l2f6v"] Oct 03 08:09:20 crc kubenswrapper[4664]: E1003 08:09:20.905180 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc" containerName="neutron-db-sync" Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.905196 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc" containerName="neutron-db-sync" Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.905439 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc" containerName="neutron-db-sync" Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.907392 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:20 crc kubenswrapper[4664]: I1003 08:09:20.943705 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-l2f6v"] Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.021776 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c47458c7b-zb4ls"] Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.024443 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.027471 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.027906 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.028084 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-km9tz" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.028242 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.030079 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c47458c7b-zb4ls"] Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.086801 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.086901 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.086931 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.086957 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-config\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.086987 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjv2n\" (UniqueName: \"kubernetes.io/projected/9397e9d7-d42d-405b-872c-df7b05e96870-kube-api-access-kjv2n\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.087015 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.087051 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-httpd-config\") pod \"neutron-6c47458c7b-zb4ls\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.087095 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-ovndb-tls-certs\") pod \"neutron-6c47458c7b-zb4ls\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.087141 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-combined-ca-bundle\") pod \"neutron-6c47458c7b-zb4ls\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.087212 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7td7\" (UniqueName: \"kubernetes.io/projected/6e72a85b-4087-4478-9d83-91a468eda59d-kube-api-access-v7td7\") pod \"neutron-6c47458c7b-zb4ls\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.087236 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-config\") pod \"neutron-6c47458c7b-zb4ls\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.188731 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.188837 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.188876 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.188898 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-config\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.188924 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjv2n\" (UniqueName: 
\"kubernetes.io/projected/9397e9d7-d42d-405b-872c-df7b05e96870-kube-api-access-kjv2n\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.188961 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.188989 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-httpd-config\") pod \"neutron-6c47458c7b-zb4ls\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.189039 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-ovndb-tls-certs\") pod \"neutron-6c47458c7b-zb4ls\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.189080 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-combined-ca-bundle\") pod \"neutron-6c47458c7b-zb4ls\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.189155 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7td7\" (UniqueName: \"kubernetes.io/projected/6e72a85b-4087-4478-9d83-91a468eda59d-kube-api-access-v7td7\") pod \"neutron-6c47458c7b-zb4ls\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.189173 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-config\") pod \"neutron-6c47458c7b-zb4ls\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.190945 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-config\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.191693 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.192462 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: 
\"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.193176 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.194435 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.199478 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-httpd-config\") pod \"neutron-6c47458c7b-zb4ls\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.202737 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-config\") pod \"neutron-6c47458c7b-zb4ls\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.208257 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-ovndb-tls-certs\") pod \"neutron-6c47458c7b-zb4ls\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.215213 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-combined-ca-bundle\") pod \"neutron-6c47458c7b-zb4ls\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.223307 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7td7\" (UniqueName: \"kubernetes.io/projected/6e72a85b-4087-4478-9d83-91a468eda59d-kube-api-access-v7td7\") pod \"neutron-6c47458c7b-zb4ls\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.229207 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjv2n\" (UniqueName: \"kubernetes.io/projected/9397e9d7-d42d-405b-872c-df7b05e96870-kube-api-access-kjv2n\") pod \"dnsmasq-dns-5ccc5c4795-l2f6v\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") " pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.264251 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.357887 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.818682 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-l2f6v"] Oct 03 08:09:21 crc kubenswrapper[4664]: W1003 08:09:21.821333 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9397e9d7_d42d_405b_872c_df7b05e96870.slice/crio-1cf93a4ad52885b300b4cd8cc66506d6097a05ac67db64d26fb8bbae925dd628 WatchSource:0}: Error finding container 1cf93a4ad52885b300b4cd8cc66506d6097a05ac67db64d26fb8bbae925dd628: Status 404 returned error can't find the container with id 1cf93a4ad52885b300b4cd8cc66506d6097a05ac67db64d26fb8bbae925dd628 Oct 03 08:09:21 crc kubenswrapper[4664]: I1003 08:09:21.847945 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" event={"ID":"9397e9d7-d42d-405b-872c-df7b05e96870","Type":"ContainerStarted","Data":"1cf93a4ad52885b300b4cd8cc66506d6097a05ac67db64d26fb8bbae925dd628"} Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.098198 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c47458c7b-zb4ls"] Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.489822 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wcvcw" Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.629184 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4q5z\" (UniqueName: \"kubernetes.io/projected/82832c17-408b-4b89-992f-09e393024fe2-kube-api-access-q4q5z\") pod \"82832c17-408b-4b89-992f-09e393024fe2\" (UID: \"82832c17-408b-4b89-992f-09e393024fe2\") " Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.630186 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82832c17-408b-4b89-992f-09e393024fe2-combined-ca-bundle\") pod \"82832c17-408b-4b89-992f-09e393024fe2\" (UID: \"82832c17-408b-4b89-992f-09e393024fe2\") " Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.630270 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/82832c17-408b-4b89-992f-09e393024fe2-db-sync-config-data\") pod \"82832c17-408b-4b89-992f-09e393024fe2\" (UID: \"82832c17-408b-4b89-992f-09e393024fe2\") " Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.635091 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82832c17-408b-4b89-992f-09e393024fe2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "82832c17-408b-4b89-992f-09e393024fe2" (UID: "82832c17-408b-4b89-992f-09e393024fe2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.635680 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82832c17-408b-4b89-992f-09e393024fe2-kube-api-access-q4q5z" (OuterVolumeSpecName: "kube-api-access-q4q5z") pod "82832c17-408b-4b89-992f-09e393024fe2" (UID: "82832c17-408b-4b89-992f-09e393024fe2"). InnerVolumeSpecName "kube-api-access-q4q5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.669900 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82832c17-408b-4b89-992f-09e393024fe2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82832c17-408b-4b89-992f-09e393024fe2" (UID: "82832c17-408b-4b89-992f-09e393024fe2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.732076 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4q5z\" (UniqueName: \"kubernetes.io/projected/82832c17-408b-4b89-992f-09e393024fe2-kube-api-access-q4q5z\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.732121 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82832c17-408b-4b89-992f-09e393024fe2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.732132 4664 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/82832c17-408b-4b89-992f-09e393024fe2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.882831 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wcvcw" event={"ID":"82832c17-408b-4b89-992f-09e393024fe2","Type":"ContainerDied","Data":"6f27a5822343bfedb48b948d2add12e9b4d284c7e6f64fd32867fbdd95f0d54e"} Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.882910 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wcvcw" Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.892408 4664 generic.go:334] "Generic (PLEG): container finished" podID="9397e9d7-d42d-405b-872c-df7b05e96870" containerID="63df5bbfceffacd376672eed0b7f21021b0909bcb29d46289a353ca616f44c69" exitCode=0 Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.883735 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f27a5822343bfedb48b948d2add12e9b4d284c7e6f64fd32867fbdd95f0d54e" Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.895851 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7b4a287-6826-4b8f-945a-aad1d1deb92a","Type":"ContainerStarted","Data":"6e101e3f9ad77dfaaff43805f29a5ac8937b6ae5f4dcde279efd7cf4b3c253a0"} Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.895883 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c47458c7b-zb4ls" event={"ID":"6e72a85b-4087-4478-9d83-91a468eda59d","Type":"ContainerStarted","Data":"0d52ba589d591ea1719e9050e7a7a4c91cb57efc9056882128d6272f61ee7b0f"} Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.895897 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c47458c7b-zb4ls" event={"ID":"6e72a85b-4087-4478-9d83-91a468eda59d","Type":"ContainerStarted","Data":"a9bcea5762989180bbf1b5cd957868de170959703d47ae52a11356c5e382a69e"} Oct 03 08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.895906 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" event={"ID":"9397e9d7-d42d-405b-872c-df7b05e96870","Type":"ContainerDied","Data":"63df5bbfceffacd376672eed0b7f21021b0909bcb29d46289a353ca616f44c69"} Oct 03 
08:09:22 crc kubenswrapper[4664]: I1003 08:09:22.918399 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.918375848 podStartE2EDuration="4.918375848s" podCreationTimestamp="2025-10-03 08:09:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:22.907969291 +0000 UTC m=+1263.729159801" watchObservedRunningTime="2025-10-03 08:09:22.918375848 +0000 UTC m=+1263.739566338" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.102016 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7f467f54bc-hkh4m"] Oct 03 08:09:23 crc kubenswrapper[4664]: E1003 08:09:23.104026 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82832c17-408b-4b89-992f-09e393024fe2" containerName="barbican-db-sync" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.104049 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="82832c17-408b-4b89-992f-09e393024fe2" containerName="barbican-db-sync" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.104287 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="82832c17-408b-4b89-992f-09e393024fe2" containerName="barbican-db-sync" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.105276 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.117013 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.117574 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.117752 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7f467f54bc-hkh4m"] Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.117895 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8xbqs" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.135641 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-58f9b497cd-8m4l7"] Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.138277 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.151017 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.191935 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-58f9b497cd-8m4l7"] Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.232144 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-l2f6v"] Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.257263 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gjfj\" (UniqueName: \"kubernetes.io/projected/ff0f93c1-983e-4202-b659-9a4b68fb015e-kube-api-access-7gjfj\") pod \"barbican-keystone-listener-58f9b497cd-8m4l7\" (UID: \"ff0f93c1-983e-4202-b659-9a4b68fb015e\") " pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.257345 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca16101-0bee-4cb4-b9f4-3a2db110eaba-combined-ca-bundle\") pod \"barbican-worker-7f467f54bc-hkh4m\" (UID: \"1ca16101-0bee-4cb4-b9f4-3a2db110eaba\") " pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.257387 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca16101-0bee-4cb4-b9f4-3a2db110eaba-config-data\") pod \"barbican-worker-7f467f54bc-hkh4m\" (UID: \"1ca16101-0bee-4cb4-b9f4-3a2db110eaba\") " pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.257459 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0f93c1-983e-4202-b659-9a4b68fb015e-combined-ca-bundle\") pod \"barbican-keystone-listener-58f9b497cd-8m4l7\" (UID: \"ff0f93c1-983e-4202-b659-9a4b68fb015e\") " pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.257492 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff0f93c1-983e-4202-b659-9a4b68fb015e-logs\") pod \"barbican-keystone-listener-58f9b497cd-8m4l7\" (UID: \"ff0f93c1-983e-4202-b659-9a4b68fb015e\") " pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.257525 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff0f93c1-983e-4202-b659-9a4b68fb015e-config-data-custom\") pod \"barbican-keystone-listener-58f9b497cd-8m4l7\" (UID: \"ff0f93c1-983e-4202-b659-9a4b68fb015e\") " pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.257550 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ca16101-0bee-4cb4-b9f4-3a2db110eaba-config-data-custom\") pod \"barbican-worker-7f467f54bc-hkh4m\" (UID: 
\"1ca16101-0bee-4cb4-b9f4-3a2db110eaba\") " pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.257575 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0f93c1-983e-4202-b659-9a4b68fb015e-config-data\") pod \"barbican-keystone-listener-58f9b497cd-8m4l7\" (UID: \"ff0f93c1-983e-4202-b659-9a4b68fb015e\") " pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.257593 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ca16101-0bee-4cb4-b9f4-3a2db110eaba-logs\") pod \"barbican-worker-7f467f54bc-hkh4m\" (UID: \"1ca16101-0bee-4cb4-b9f4-3a2db110eaba\") " pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.257649 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb2r4\" (UniqueName: \"kubernetes.io/projected/1ca16101-0bee-4cb4-b9f4-3a2db110eaba-kube-api-access-tb2r4\") pod \"barbican-worker-7f467f54bc-hkh4m\" (UID: \"1ca16101-0bee-4cb4-b9f4-3a2db110eaba\") " pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.305028 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-vz86c"] Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.306982 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.344645 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-vz86c"] Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.359960 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff0f93c1-983e-4202-b659-9a4b68fb015e-logs\") pod \"barbican-keystone-listener-58f9b497cd-8m4l7\" (UID: \"ff0f93c1-983e-4202-b659-9a4b68fb015e\") " pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.362598 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff0f93c1-983e-4202-b659-9a4b68fb015e-logs\") pod \"barbican-keystone-listener-58f9b497cd-8m4l7\" (UID: \"ff0f93c1-983e-4202-b659-9a4b68fb015e\") " pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.362693 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff0f93c1-983e-4202-b659-9a4b68fb015e-config-data-custom\") pod \"barbican-keystone-listener-58f9b497cd-8m4l7\" (UID: \"ff0f93c1-983e-4202-b659-9a4b68fb015e\") " pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.362739 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ca16101-0bee-4cb4-b9f4-3a2db110eaba-config-data-custom\") pod \"barbican-worker-7f467f54bc-hkh4m\" (UID: \"1ca16101-0bee-4cb4-b9f4-3a2db110eaba\") " pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.362785 4664 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0f93c1-983e-4202-b659-9a4b68fb015e-config-data\") pod \"barbican-keystone-listener-58f9b497cd-8m4l7\" (UID: \"ff0f93c1-983e-4202-b659-9a4b68fb015e\") " pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.362808 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ca16101-0bee-4cb4-b9f4-3a2db110eaba-logs\") pod \"barbican-worker-7f467f54bc-hkh4m\" (UID: \"1ca16101-0bee-4cb4-b9f4-3a2db110eaba\") " pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.363055 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb2r4\" (UniqueName: \"kubernetes.io/projected/1ca16101-0bee-4cb4-b9f4-3a2db110eaba-kube-api-access-tb2r4\") pod \"barbican-worker-7f467f54bc-hkh4m\" (UID: \"1ca16101-0bee-4cb4-b9f4-3a2db110eaba\") " pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.363163 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gjfj\" (UniqueName: \"kubernetes.io/projected/ff0f93c1-983e-4202-b659-9a4b68fb015e-kube-api-access-7gjfj\") pod \"barbican-keystone-listener-58f9b497cd-8m4l7\" (UID: \"ff0f93c1-983e-4202-b659-9a4b68fb015e\") " pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.363294 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca16101-0bee-4cb4-b9f4-3a2db110eaba-combined-ca-bundle\") pod \"barbican-worker-7f467f54bc-hkh4m\" (UID: \"1ca16101-0bee-4cb4-b9f4-3a2db110eaba\") " pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.363366 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca16101-0bee-4cb4-b9f4-3a2db110eaba-config-data\") pod \"barbican-worker-7f467f54bc-hkh4m\" (UID: \"1ca16101-0bee-4cb4-b9f4-3a2db110eaba\") " pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.363532 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0f93c1-983e-4202-b659-9a4b68fb015e-combined-ca-bundle\") pod \"barbican-keystone-listener-58f9b497cd-8m4l7\" (UID: \"ff0f93c1-983e-4202-b659-9a4b68fb015e\") " pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.369206 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ca16101-0bee-4cb4-b9f4-3a2db110eaba-logs\") pod \"barbican-worker-7f467f54bc-hkh4m\" (UID: \"1ca16101-0bee-4cb4-b9f4-3a2db110eaba\") " pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.369904 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0f93c1-983e-4202-b659-9a4b68fb015e-combined-ca-bundle\") pod \"barbican-keystone-listener-58f9b497cd-8m4l7\" (UID: \"ff0f93c1-983e-4202-b659-9a4b68fb015e\") " 
pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.377760 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0f93c1-983e-4202-b659-9a4b68fb015e-config-data\") pod \"barbican-keystone-listener-58f9b497cd-8m4l7\" (UID: \"ff0f93c1-983e-4202-b659-9a4b68fb015e\") " pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.384916 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ca16101-0bee-4cb4-b9f4-3a2db110eaba-config-data-custom\") pod \"barbican-worker-7f467f54bc-hkh4m\" (UID: \"1ca16101-0bee-4cb4-b9f4-3a2db110eaba\") " pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.385486 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca16101-0bee-4cb4-b9f4-3a2db110eaba-combined-ca-bundle\") pod \"barbican-worker-7f467f54bc-hkh4m\" (UID: \"1ca16101-0bee-4cb4-b9f4-3a2db110eaba\") " pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.385625 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff0f93c1-983e-4202-b659-9a4b68fb015e-config-data-custom\") pod \"barbican-keystone-listener-58f9b497cd-8m4l7\" (UID: \"ff0f93c1-983e-4202-b659-9a4b68fb015e\") " pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.386796 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca16101-0bee-4cb4-b9f4-3a2db110eaba-config-data\") pod \"barbican-worker-7f467f54bc-hkh4m\" (UID: \"1ca16101-0bee-4cb4-b9f4-3a2db110eaba\") " pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.390492 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb2r4\" (UniqueName: \"kubernetes.io/projected/1ca16101-0bee-4cb4-b9f4-3a2db110eaba-kube-api-access-tb2r4\") pod \"barbican-worker-7f467f54bc-hkh4m\" (UID: \"1ca16101-0bee-4cb4-b9f4-3a2db110eaba\") " pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.390561 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-58f858bbd-kn4ws"] Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.393353 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.396477 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.405681 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58f858bbd-kn4ws"] Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.439652 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gjfj\" (UniqueName: \"kubernetes.io/projected/ff0f93c1-983e-4202-b659-9a4b68fb015e-kube-api-access-7gjfj\") pod \"barbican-keystone-listener-58f9b497cd-8m4l7\" (UID: \"ff0f93c1-983e-4202-b659-9a4b68fb015e\") " pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.472078 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8bz4\" (UniqueName: \"kubernetes.io/projected/007d3aea-e570-499e-8373-3e43e65865c1-kube-api-access-q8bz4\") pod \"barbican-api-58f858bbd-kn4ws\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.472442 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-combined-ca-bundle\") pod \"barbican-api-58f858bbd-kn4ws\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.472554 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-config\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.472713 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.472875 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.473004 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xws2\" (UniqueName: \"kubernetes.io/projected/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-kube-api-access-5xws2\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.473102 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007d3aea-e570-499e-8373-3e43e65865c1-logs\") pod 
\"barbican-api-58f858bbd-kn4ws\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.473597 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-config-data\") pod \"barbican-api-58f858bbd-kn4ws\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.473743 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.473905 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-dns-svc\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.474063 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-config-data-custom\") pod \"barbican-api-58f858bbd-kn4ws\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.504124 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b754456d9-lc2mg"] Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.505722 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7f467f54bc-hkh4m" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.507090 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.511316 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.511684 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.518951 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.532676 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-85fcf9fb6-r8r76" podUID="f1015cf1-8e4b-44fd-a794-27edfecdceed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.550351 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b754456d9-lc2mg"] Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.575503 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-combined-ca-bundle\") pod \"barbican-api-58f858bbd-kn4ws\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.575591 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-config\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.575698 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.575744 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.575783 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xws2\" (UniqueName: \"kubernetes.io/projected/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-kube-api-access-5xws2\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.575810 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007d3aea-e570-499e-8373-3e43e65865c1-logs\") pod \"barbican-api-58f858bbd-kn4ws\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.575857 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-config-data\") pod \"barbican-api-58f858bbd-kn4ws\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.575891 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.575941 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-dns-svc\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.575993 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-config-data-custom\") pod \"barbican-api-58f858bbd-kn4ws\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.576026 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8bz4\" (UniqueName: \"kubernetes.io/projected/007d3aea-e570-499e-8373-3e43e65865c1-kube-api-access-q8bz4\") pod \"barbican-api-58f858bbd-kn4ws\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.577410 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007d3aea-e570-499e-8373-3e43e65865c1-logs\") pod \"barbican-api-58f858bbd-kn4ws\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.577530 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-config\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.580773 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.581738 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.582250 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.590716 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-dns-svc\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: 
\"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.596189 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-76644f9584-br5jb" podUID="30ce3373-ef30-4727-b57f-5be7963d1892" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.596889 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-combined-ca-bundle\") pod \"barbican-api-58f858bbd-kn4ws\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.597525 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-config-data\") pod \"barbican-api-58f858bbd-kn4ws\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.599458 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-config-data-custom\") pod \"barbican-api-58f858bbd-kn4ws\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.608096 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8bz4\" (UniqueName: \"kubernetes.io/projected/007d3aea-e570-499e-8373-3e43e65865c1-kube-api-access-q8bz4\") pod \"barbican-api-58f858bbd-kn4ws\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.610300 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xws2\" (UniqueName: \"kubernetes.io/projected/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-kube-api-access-5xws2\") pod \"dnsmasq-dns-688c87cc99-vz86c\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.643160 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.679134 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-ovndb-tls-certs\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.679328 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-public-tls-certs\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.679378 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-httpd-config\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.679511 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-internal-tls-certs\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.679654 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-combined-ca-bundle\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.679766 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-config\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.679914 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb7xq\" (UniqueName: \"kubernetes.io/projected/d882eea5-c7df-4023-b542-96e4057ad948-kube-api-access-mb7xq\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.781523 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-public-tls-certs\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.781581 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-httpd-config\") pod \"neutron-b754456d9-lc2mg\" (UID: 
\"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.781713 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-internal-tls-certs\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.781735 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-combined-ca-bundle\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.781761 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-config\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.781812 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb7xq\" (UniqueName: \"kubernetes.io/projected/d882eea5-c7df-4023-b542-96e4057ad948-kube-api-access-mb7xq\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.781884 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-ovndb-tls-certs\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.786374 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-public-tls-certs\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.786537 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-internal-tls-certs\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.787241 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-httpd-config\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.790401 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-combined-ca-bundle\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.792420 4664 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-config\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.799259 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d882eea5-c7df-4023-b542-96e4057ad948-ovndb-tls-certs\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.804461 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb7xq\" (UniqueName: \"kubernetes.io/projected/d882eea5-c7df-4023-b542-96e4057ad948-kube-api-access-mb7xq\") pod \"neutron-b754456d9-lc2mg\" (UID: \"d882eea5-c7df-4023-b542-96e4057ad948\") " pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.804966 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:23 crc kubenswrapper[4664]: I1003 08:09:23.857843 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.210111 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.210794 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.280649 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.297389 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.478652 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-797fcd6b7d-kpnbp"] Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.481995 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.506346 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.506471 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.524629 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-797fcd6b7d-kpnbp"] Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.562258 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4cmz\" (UniqueName: \"kubernetes.io/projected/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-kube-api-access-l4cmz\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.562312 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-config-data-custom\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.562392 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-config-data\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.562441 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-public-tls-certs\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.562501 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-logs\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.562518 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-internal-tls-certs\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.562541 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-combined-ca-bundle\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.664173 4664 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-logs\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.664217 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-internal-tls-certs\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.664269 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-combined-ca-bundle\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.664339 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4cmz\" (UniqueName: \"kubernetes.io/projected/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-kube-api-access-l4cmz\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.664363 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-config-data-custom\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.664444 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-config-data\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.664582 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-public-tls-certs\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.664712 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-logs\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.672104 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-config-data-custom\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.672122 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-combined-ca-bundle\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.682788 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-internal-tls-certs\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.683154 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-public-tls-certs\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.684289 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-config-data\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.687491 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4cmz\" (UniqueName: \"kubernetes.io/projected/dc9a9e6e-b8f5-4991-9a64-7a928f66075c-kube-api-access-l4cmz\") pod \"barbican-api-797fcd6b7d-kpnbp\" (UID: \"dc9a9e6e-b8f5-4991-9a64-7a928f66075c\") " pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.839825 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.944875 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 08:09:27 crc kubenswrapper[4664]: I1003 08:09:27.944938 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 08:09:29 crc kubenswrapper[4664]: I1003 08:09:29.232371 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 08:09:29 crc kubenswrapper[4664]: I1003 08:09:29.232764 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 08:09:29 crc kubenswrapper[4664]: I1003 08:09:29.267229 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 08:09:29 crc kubenswrapper[4664]: I1003 08:09:29.282224 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 08:09:29 crc kubenswrapper[4664]: I1003 08:09:29.963139 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 08:09:29 crc kubenswrapper[4664]: I1003 08:09:29.963751 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 08:09:30 crc kubenswrapper[4664]: I1003 08:09:30.166535 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 08:09:30 crc kubenswrapper[4664]: I1003 08:09:30.166723 4664 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:09:30 crc kubenswrapper[4664]: I1003 08:09:30.431136 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 08:09:31 crc kubenswrapper[4664]: I1003 08:09:31.978884 4664 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:09:31 crc kubenswrapper[4664]: I1003 08:09:31.978915 4664 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:09:32 crc kubenswrapper[4664]: I1003 08:09:32.179810 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 08:09:32 crc kubenswrapper[4664]: I1003 08:09:32.319771 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 08:09:32 crc kubenswrapper[4664]: E1003 08:09:32.420256 4664 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Oct 03 08:09:32 crc kubenswrapper[4664]: E1003 08:09:32.420541 4664 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nts6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(7050eaa4-061a-4dd3-b4da-73e2abd04458): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:09:32 crc kubenswrapper[4664]: E1003 08:09:32.431906 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="7050eaa4-061a-4dd3-b4da-73e2abd04458" Oct 03 08:09:32 crc kubenswrapper[4664]: I1003 08:09:32.871114 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-58f9b497cd-8m4l7"] Oct 03 08:09:32 crc kubenswrapper[4664]: W1003 08:09:32.896257 4664 manager.go:1169] Failed to process watch event {EventType:0 
Oct 03 08:09:32 crc kubenswrapper[4664]: I1003 08:09:32.997536 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" event={"ID":"ff0f93c1-983e-4202-b659-9a4b68fb015e","Type":"ContainerStarted","Data":"66cdd9feec83ebc38049785b3b021aca4ae32fdb684834e01a391de0dc1b0162"}
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.007056 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c47458c7b-zb4ls" event={"ID":"6e72a85b-4087-4478-9d83-91a468eda59d","Type":"ContainerStarted","Data":"fa764f39bd76987fb25c5925222d5f6a1c5c9e9227bfd785e27565753d068f04"}
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.009062 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c47458c7b-zb4ls"
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.015933 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" event={"ID":"9397e9d7-d42d-405b-872c-df7b05e96870","Type":"ContainerStarted","Data":"c6a853d9553fadd8194d3789b018a3ea80ffdfff2998a2f7eea4609c428a034d"}
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.016339 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" podUID="9397e9d7-d42d-405b-872c-df7b05e96870" containerName="dnsmasq-dns" containerID="cri-o://c6a853d9553fadd8194d3789b018a3ea80ffdfff2998a2f7eea4609c428a034d" gracePeriod=10
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.016446 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerName="ceilometer-central-agent" containerID="cri-o://2530ee8677ae62c71817ffe80d3cc18b30d54b1718a2b3ffc74a70b706a727d2" gracePeriod=30
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.016514 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v"
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.016598 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerName="sg-core" containerID="cri-o://287d1f9ca3e681244d3cce039edfcd409cea2b90f4debc427dfadeab6c59b4e2" gracePeriod=30
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.016665 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerName="ceilometer-notification-agent" containerID="cri-o://dec11e910be9792de36f8c4bb865a990edfa318b57c9d2c1b37962bde0dc3069" gracePeriod=30
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.029827 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7f467f54bc-hkh4m"]
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.037695 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-vz86c"]
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.048257 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c47458c7b-zb4ls" podStartSLOduration=13.048231789999999 podStartE2EDuration="13.04823179s" podCreationTimestamp="2025-10-03 08:09:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:33.029380192 +0000 UTC m=+1273.850570682" watchObservedRunningTime="2025-10-03 08:09:33.04823179 +0000 UTC m=+1273.869422280"
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.061755 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" podStartSLOduration=13.061736576 podStartE2EDuration="13.061736576s" podCreationTimestamp="2025-10-03 08:09:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:33.058063581 +0000 UTC m=+1273.879254081" watchObservedRunningTime="2025-10-03 08:09:33.061736576 +0000 UTC m=+1273.882927066"
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.118386 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-797fcd6b7d-kpnbp"]
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.243866 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b754456d9-lc2mg"]
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.279329 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58f858bbd-kn4ws"]
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.606065 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v"
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.701574 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-dns-svc\") pod \"9397e9d7-d42d-405b-872c-df7b05e96870\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") "
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.701727 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-dns-swift-storage-0\") pod \"9397e9d7-d42d-405b-872c-df7b05e96870\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") "
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.702057 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjv2n\" (UniqueName: \"kubernetes.io/projected/9397e9d7-d42d-405b-872c-df7b05e96870-kube-api-access-kjv2n\") pod \"9397e9d7-d42d-405b-872c-df7b05e96870\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") "
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.702128 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-ovsdbserver-sb\") pod \"9397e9d7-d42d-405b-872c-df7b05e96870\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") "
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.702189 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-config\") pod \"9397e9d7-d42d-405b-872c-df7b05e96870\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") "
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.702255 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-ovsdbserver-nb\") pod \"9397e9d7-d42d-405b-872c-df7b05e96870\" (UID: \"9397e9d7-d42d-405b-872c-df7b05e96870\") "
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.708824 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9397e9d7-d42d-405b-872c-df7b05e96870-kube-api-access-kjv2n" (OuterVolumeSpecName: "kube-api-access-kjv2n") pod "9397e9d7-d42d-405b-872c-df7b05e96870" (UID: "9397e9d7-d42d-405b-872c-df7b05e96870"). InnerVolumeSpecName "kube-api-access-kjv2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.767334 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9397e9d7-d42d-405b-872c-df7b05e96870" (UID: "9397e9d7-d42d-405b-872c-df7b05e96870"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.779438 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-config" (OuterVolumeSpecName: "config") pod "9397e9d7-d42d-405b-872c-df7b05e96870" (UID: "9397e9d7-d42d-405b-872c-df7b05e96870"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.784273 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9397e9d7-d42d-405b-872c-df7b05e96870" (UID: "9397e9d7-d42d-405b-872c-df7b05e96870"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.788660 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9397e9d7-d42d-405b-872c-df7b05e96870" (UID: "9397e9d7-d42d-405b-872c-df7b05e96870"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.804665 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjv2n\" (UniqueName: \"kubernetes.io/projected/9397e9d7-d42d-405b-872c-df7b05e96870-kube-api-access-kjv2n\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.804708 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.804720 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-config\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.804733 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.804745 4664 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.811091 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9397e9d7-d42d-405b-872c-df7b05e96870" (UID: "9397e9d7-d42d-405b-872c-df7b05e96870"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:09:33 crc kubenswrapper[4664]: I1003 08:09:33.907716 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9397e9d7-d42d-405b-872c-df7b05e96870-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.043731 4664 generic.go:334] "Generic (PLEG): container finished" podID="618be4c7-ebc2-43f5-aed3-f9e0c83fad8d" containerID="40a58d8ae436181512650f3adae277582da01b374f17cf08d3006ea4eb7a6b3b" exitCode=0
Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.043901 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-vz86c" event={"ID":"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d","Type":"ContainerDied","Data":"40a58d8ae436181512650f3adae277582da01b374f17cf08d3006ea4eb7a6b3b"}
Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.046210 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-vz86c" event={"ID":"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d","Type":"ContainerStarted","Data":"62860479bda86ae6d37513b0f72b21c7b8d57d59bfe051716b193ff982b626d7"}
Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.051040 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b754456d9-lc2mg" event={"ID":"d882eea5-c7df-4023-b542-96e4057ad948","Type":"ContainerStarted","Data":"c9634e3e0fb9bb1813ef382b2f41ecdc6de87d57d4d38329d271dc65e2c5d978"}
Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.051761 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b754456d9-lc2mg" event={"ID":"d882eea5-c7df-4023-b542-96e4057ad948","Type":"ContainerStarted","Data":"5cf1b228f3b361703b256883e12e01026fdb61aa4efcc9a3b7623696cc5de2ff"}
event={"ID":"d882eea5-c7df-4023-b542-96e4057ad948","Type":"ContainerStarted","Data":"5cf1b228f3b361703b256883e12e01026fdb61aa4efcc9a3b7623696cc5de2ff"} Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.063743 4664 generic.go:334] "Generic (PLEG): container finished" podID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerID="287d1f9ca3e681244d3cce039edfcd409cea2b90f4debc427dfadeab6c59b4e2" exitCode=2 Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.063780 4664 generic.go:334] "Generic (PLEG): container finished" podID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerID="2530ee8677ae62c71817ffe80d3cc18b30d54b1718a2b3ffc74a70b706a727d2" exitCode=0 Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.063835 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7050eaa4-061a-4dd3-b4da-73e2abd04458","Type":"ContainerDied","Data":"287d1f9ca3e681244d3cce039edfcd409cea2b90f4debc427dfadeab6c59b4e2"} Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.063863 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7050eaa4-061a-4dd3-b4da-73e2abd04458","Type":"ContainerDied","Data":"2530ee8677ae62c71817ffe80d3cc18b30d54b1718a2b3ffc74a70b706a727d2"} Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.066226 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7f467f54bc-hkh4m" event={"ID":"1ca16101-0bee-4cb4-b9f4-3a2db110eaba","Type":"ContainerStarted","Data":"35d3652682ab97e12a79dd582b58f5f0d654d0866553dca765ec56a3b7a0716f"} Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.082325 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-797fcd6b7d-kpnbp" event={"ID":"dc9a9e6e-b8f5-4991-9a64-7a928f66075c","Type":"ContainerStarted","Data":"d973681f98f94df968b98507e6713103253b6c1c94613654cfaa15272804ad64"} Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.082369 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-797fcd6b7d-kpnbp" event={"ID":"dc9a9e6e-b8f5-4991-9a64-7a928f66075c","Type":"ContainerStarted","Data":"42b52311677cd844cb39bb909c9a242c17f1104a59ff09502e281eeed0404cb2"} Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.084182 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58f858bbd-kn4ws" event={"ID":"007d3aea-e570-499e-8373-3e43e65865c1","Type":"ContainerStarted","Data":"5d221f7f5f88e0b33e037e3d454abee6ee4db6e00718bc2a5fa26031433469c6"} Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.084205 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58f858bbd-kn4ws" event={"ID":"007d3aea-e570-499e-8373-3e43e65865c1","Type":"ContainerStarted","Data":"d18785270b9b3778c22cb4e4f38f22a1aee28663f0339d1cdfd9da0f5cfb3a29"} Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.089110 4664 generic.go:334] "Generic (PLEG): container finished" podID="9397e9d7-d42d-405b-872c-df7b05e96870" containerID="c6a853d9553fadd8194d3789b018a3ea80ffdfff2998a2f7eea4609c428a034d" exitCode=0 Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.090623 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.091635 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" event={"ID":"9397e9d7-d42d-405b-872c-df7b05e96870","Type":"ContainerDied","Data":"c6a853d9553fadd8194d3789b018a3ea80ffdfff2998a2f7eea4609c428a034d"} Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.091674 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-l2f6v" event={"ID":"9397e9d7-d42d-405b-872c-df7b05e96870","Type":"ContainerDied","Data":"1cf93a4ad52885b300b4cd8cc66506d6097a05ac67db64d26fb8bbae925dd628"} Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.091698 4664 scope.go:117] "RemoveContainer" containerID="c6a853d9553fadd8194d3789b018a3ea80ffdfff2998a2f7eea4609c428a034d" Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.144663 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-l2f6v"] Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.155033 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-l2f6v"] Oct 03 08:09:34 crc kubenswrapper[4664]: I1003 08:09:34.759945 4664 scope.go:117] "RemoveContainer" containerID="63df5bbfceffacd376672eed0b7f21021b0909bcb29d46289a353ca616f44c69" Oct 03 08:09:35 crc kubenswrapper[4664]: I1003 08:09:35.099920 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-797fcd6b7d-kpnbp" event={"ID":"dc9a9e6e-b8f5-4991-9a64-7a928f66075c","Type":"ContainerStarted","Data":"f003d931f3c66b85f87eb41dd3e4e42c3f47ab89e5b9f777e74dc28b04d03f4f"} Oct 03 08:09:35 crc kubenswrapper[4664]: I1003 08:09:35.101340 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:35 crc kubenswrapper[4664]: I1003 08:09:35.101374 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:35 crc kubenswrapper[4664]: I1003 08:09:35.126376 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-797fcd6b7d-kpnbp" podStartSLOduration=8.12635628 podStartE2EDuration="8.12635628s" podCreationTimestamp="2025-10-03 08:09:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:35.121361778 +0000 UTC m=+1275.942552268" watchObservedRunningTime="2025-10-03 08:09:35.12635628 +0000 UTC m=+1275.947546770" Oct 03 08:09:35 crc kubenswrapper[4664]: I1003 08:09:35.380036 4664 scope.go:117] "RemoveContainer" containerID="c6a853d9553fadd8194d3789b018a3ea80ffdfff2998a2f7eea4609c428a034d" Oct 03 08:09:35 crc kubenswrapper[4664]: E1003 08:09:35.380580 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6a853d9553fadd8194d3789b018a3ea80ffdfff2998a2f7eea4609c428a034d\": container with ID starting with c6a853d9553fadd8194d3789b018a3ea80ffdfff2998a2f7eea4609c428a034d not found: ID does not exist" containerID="c6a853d9553fadd8194d3789b018a3ea80ffdfff2998a2f7eea4609c428a034d" Oct 03 08:09:35 crc kubenswrapper[4664]: I1003 08:09:35.380638 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6a853d9553fadd8194d3789b018a3ea80ffdfff2998a2f7eea4609c428a034d"} err="failed to get container status 
\"c6a853d9553fadd8194d3789b018a3ea80ffdfff2998a2f7eea4609c428a034d\": rpc error: code = NotFound desc = could not find container \"c6a853d9553fadd8194d3789b018a3ea80ffdfff2998a2f7eea4609c428a034d\": container with ID starting with c6a853d9553fadd8194d3789b018a3ea80ffdfff2998a2f7eea4609c428a034d not found: ID does not exist" Oct 03 08:09:35 crc kubenswrapper[4664]: I1003 08:09:35.380664 4664 scope.go:117] "RemoveContainer" containerID="63df5bbfceffacd376672eed0b7f21021b0909bcb29d46289a353ca616f44c69" Oct 03 08:09:35 crc kubenswrapper[4664]: E1003 08:09:35.383944 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63df5bbfceffacd376672eed0b7f21021b0909bcb29d46289a353ca616f44c69\": container with ID starting with 63df5bbfceffacd376672eed0b7f21021b0909bcb29d46289a353ca616f44c69 not found: ID does not exist" containerID="63df5bbfceffacd376672eed0b7f21021b0909bcb29d46289a353ca616f44c69" Oct 03 08:09:35 crc kubenswrapper[4664]: I1003 08:09:35.383984 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63df5bbfceffacd376672eed0b7f21021b0909bcb29d46289a353ca616f44c69"} err="failed to get container status \"63df5bbfceffacd376672eed0b7f21021b0909bcb29d46289a353ca616f44c69\": rpc error: code = NotFound desc = could not find container \"63df5bbfceffacd376672eed0b7f21021b0909bcb29d46289a353ca616f44c69\": container with ID starting with 63df5bbfceffacd376672eed0b7f21021b0909bcb29d46289a353ca616f44c69 not found: ID does not exist" Oct 03 08:09:35 crc kubenswrapper[4664]: I1003 08:09:35.646904 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:09:35 crc kubenswrapper[4664]: I1003 08:09:35.887417 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9397e9d7-d42d-405b-872c-df7b05e96870" path="/var/lib/kubelet/pods/9397e9d7-d42d-405b-872c-df7b05e96870/volumes" Oct 03 08:09:35 crc kubenswrapper[4664]: I1003 08:09:35.942230 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.130129 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58f858bbd-kn4ws" event={"ID":"007d3aea-e570-499e-8373-3e43e65865c1","Type":"ContainerStarted","Data":"46fcd88dee1d3df9568ee3c1a5341aaf3ab6ccc47d4821f99f8d52cf576fa435"} Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.131435 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.131501 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.138164 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-vz86c" event={"ID":"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d","Type":"ContainerStarted","Data":"e15838efc9037f1d9632b7ff99ce466257dee4188a30f28bdf538bb19c926da1"} Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.138323 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.144410 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b754456d9-lc2mg" 
event={"ID":"d882eea5-c7df-4023-b542-96e4057ad948","Type":"ContainerStarted","Data":"b868054f0618dd7b6656b23e1ba0e25ecf153d890b148ddfd8ec16704ff01040"} Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.145315 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.162163 4664 generic.go:334] "Generic (PLEG): container finished" podID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerID="dec11e910be9792de36f8c4bb865a990edfa318b57c9d2c1b37962bde0dc3069" exitCode=0 Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.162286 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7050eaa4-061a-4dd3-b4da-73e2abd04458","Type":"ContainerDied","Data":"dec11e910be9792de36f8c4bb865a990edfa318b57c9d2c1b37962bde0dc3069"} Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.162389 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7050eaa4-061a-4dd3-b4da-73e2abd04458","Type":"ContainerDied","Data":"9202a00eed664a74b2d05e905ce3a5fac90948c27a204f5e1b3d0ea9524b3ed4"} Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.162406 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9202a00eed664a74b2d05e905ce3a5fac90948c27a204f5e1b3d0ea9524b3ed4" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.170680 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.170820 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-58f858bbd-kn4ws" podStartSLOduration=13.170801959 podStartE2EDuration="13.170801959s" podCreationTimestamp="2025-10-03 08:09:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:36.149059259 +0000 UTC m=+1276.970249739" watchObservedRunningTime="2025-10-03 08:09:36.170801959 +0000 UTC m=+1276.991992449" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.187321 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-vz86c" podStartSLOduration=13.18730429 podStartE2EDuration="13.18730429s" podCreationTimestamp="2025-10-03 08:09:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:36.177668175 +0000 UTC m=+1276.998858675" watchObservedRunningTime="2025-10-03 08:09:36.18730429 +0000 UTC m=+1277.008494770" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.207242 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b754456d9-lc2mg" podStartSLOduration=13.207217438 podStartE2EDuration="13.207217438s" podCreationTimestamp="2025-10-03 08:09:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:36.198428877 +0000 UTC m=+1277.019619377" watchObservedRunningTime="2025-10-03 08:09:36.207217438 +0000 UTC m=+1277.028407928" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.263174 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-config-data\") pod 
\"7050eaa4-061a-4dd3-b4da-73e2abd04458\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.263291 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-sg-core-conf-yaml\") pod \"7050eaa4-061a-4dd3-b4da-73e2abd04458\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.263329 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nts6v\" (UniqueName: \"kubernetes.io/projected/7050eaa4-061a-4dd3-b4da-73e2abd04458-kube-api-access-nts6v\") pod \"7050eaa4-061a-4dd3-b4da-73e2abd04458\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.263358 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-scripts\") pod \"7050eaa4-061a-4dd3-b4da-73e2abd04458\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.263508 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-combined-ca-bundle\") pod \"7050eaa4-061a-4dd3-b4da-73e2abd04458\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.263660 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7050eaa4-061a-4dd3-b4da-73e2abd04458-log-httpd\") pod \"7050eaa4-061a-4dd3-b4da-73e2abd04458\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.263697 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7050eaa4-061a-4dd3-b4da-73e2abd04458-run-httpd\") pod \"7050eaa4-061a-4dd3-b4da-73e2abd04458\" (UID: \"7050eaa4-061a-4dd3-b4da-73e2abd04458\") " Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.266042 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7050eaa4-061a-4dd3-b4da-73e2abd04458-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7050eaa4-061a-4dd3-b4da-73e2abd04458" (UID: "7050eaa4-061a-4dd3-b4da-73e2abd04458"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.269428 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7050eaa4-061a-4dd3-b4da-73e2abd04458-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7050eaa4-061a-4dd3-b4da-73e2abd04458" (UID: "7050eaa4-061a-4dd3-b4da-73e2abd04458"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.270871 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7050eaa4-061a-4dd3-b4da-73e2abd04458-kube-api-access-nts6v" (OuterVolumeSpecName: "kube-api-access-nts6v") pod "7050eaa4-061a-4dd3-b4da-73e2abd04458" (UID: "7050eaa4-061a-4dd3-b4da-73e2abd04458"). InnerVolumeSpecName "kube-api-access-nts6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.271170 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-scripts" (OuterVolumeSpecName: "scripts") pod "7050eaa4-061a-4dd3-b4da-73e2abd04458" (UID: "7050eaa4-061a-4dd3-b4da-73e2abd04458"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.304657 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7050eaa4-061a-4dd3-b4da-73e2abd04458" (UID: "7050eaa4-061a-4dd3-b4da-73e2abd04458"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.329581 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-config-data" (OuterVolumeSpecName: "config-data") pod "7050eaa4-061a-4dd3-b4da-73e2abd04458" (UID: "7050eaa4-061a-4dd3-b4da-73e2abd04458"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.343472 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7050eaa4-061a-4dd3-b4da-73e2abd04458" (UID: "7050eaa4-061a-4dd3-b4da-73e2abd04458"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.373645 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.373886 4664 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.373957 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nts6v\" (UniqueName: \"kubernetes.io/projected/7050eaa4-061a-4dd3-b4da-73e2abd04458-kube-api-access-nts6v\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.379246 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.379282 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7050eaa4-061a-4dd3-b4da-73e2abd04458-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.379296 4664 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7050eaa4-061a-4dd3-b4da-73e2abd04458-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:36 crc kubenswrapper[4664]: I1003 08:09:36.379307 4664 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7050eaa4-061a-4dd3-b4da-73e2abd04458-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.174069 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7f467f54bc-hkh4m" event={"ID":"1ca16101-0bee-4cb4-b9f4-3a2db110eaba","Type":"ContainerStarted","Data":"0f709d2ad28530e9ef4d42c98ec2fac99bb2bce152017e1e24d47d377a1ad6d7"} Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.174685 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7f467f54bc-hkh4m" event={"ID":"1ca16101-0bee-4cb4-b9f4-3a2db110eaba","Type":"ContainerStarted","Data":"1f31ccc934376737dfcc02242f377cc41c26fda1fd4cf945ad34f3841702954d"} Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.177742 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.178454 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" event={"ID":"ff0f93c1-983e-4202-b659-9a4b68fb015e","Type":"ContainerStarted","Data":"612da40f3e5a8761b53cb9b1cba4ee9a4f7bc19464aab8733dba46614c51fc7f"} Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.178516 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" event={"ID":"ff0f93c1-983e-4202-b659-9a4b68fb015e","Type":"ContainerStarted","Data":"33c172c269613a638bbf5f16ad7ac1480590e8e93adf8fd7586656513d45f244"} Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.203069 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7f467f54bc-hkh4m" podStartSLOduration=11.259235652 podStartE2EDuration="14.20302953s" podCreationTimestamp="2025-10-03 08:09:23 +0000 UTC" firstStartedPulling="2025-10-03 08:09:33.025412369 +0000 UTC m=+1273.846602859" lastFinishedPulling="2025-10-03 08:09:35.969206247 +0000 UTC m=+1276.790396737" observedRunningTime="2025-10-03 08:09:37.199226391 +0000 UTC m=+1278.020416901" watchObservedRunningTime="2025-10-03 08:09:37.20302953 +0000 UTC m=+1278.024220020" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.223923 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-58f9b497cd-8m4l7" podStartSLOduration=11.203413059 podStartE2EDuration="14.223905655s" podCreationTimestamp="2025-10-03 08:09:23 +0000 UTC" firstStartedPulling="2025-10-03 08:09:32.899101956 +0000 UTC m=+1273.720292446" lastFinishedPulling="2025-10-03 08:09:35.919594552 +0000 UTC m=+1276.740785042" observedRunningTime="2025-10-03 08:09:37.221168677 +0000 UTC m=+1278.042359177" watchObservedRunningTime="2025-10-03 08:09:37.223905655 +0000 UTC m=+1278.045096145" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.286281 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.293809 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.330679 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:09:37 crc kubenswrapper[4664]: E1003 08:09:37.331205 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerName="ceilometer-central-agent" Oct 03 08:09:37 crc kubenswrapper[4664]: 
Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.286281 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.293809 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.330679 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 03 08:09:37 crc kubenswrapper[4664]: E1003 08:09:37.331205 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerName="ceilometer-central-agent"
Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.331229 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerName="ceilometer-central-agent"
Oct 03 08:09:37 crc kubenswrapper[4664]: E1003 08:09:37.331252 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerName="ceilometer-notification-agent"
Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.331261 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerName="ceilometer-notification-agent"
Oct 03 08:09:37 crc kubenswrapper[4664]: E1003 08:09:37.331273 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9397e9d7-d42d-405b-872c-df7b05e96870" containerName="init"
Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.331280 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9397e9d7-d42d-405b-872c-df7b05e96870" containerName="init"
Oct 03 08:09:37 crc kubenswrapper[4664]: E1003 08:09:37.331295 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerName="sg-core"
Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.331303 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerName="sg-core"
Oct 03 08:09:37 crc kubenswrapper[4664]: E1003 08:09:37.331336 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9397e9d7-d42d-405b-872c-df7b05e96870" containerName="dnsmasq-dns"
Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.331346 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9397e9d7-d42d-405b-872c-df7b05e96870" containerName="dnsmasq-dns"
Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.331565 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerName="ceilometer-notification-agent"
Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.331586 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerName="sg-core"
Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.331601 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="9397e9d7-d42d-405b-872c-df7b05e96870" containerName="dnsmasq-dns"
Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.331634 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="7050eaa4-061a-4dd3-b4da-73e2abd04458" containerName="ceilometer-central-agent"
Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.333395 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
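[Annotation] When ceilometer-0 is re-created under the same name but a new UID (e259c239-66b0-409a-8819-91430929950a, visible below), the CPU and memory managers sweep per-container state recorded under UIDs that no longer exist (the old ceilometer-0 pod 7050eaa4-... and an earlier dnsmasq pod 9397e9d7-...). An illustrative toy sketch of such a sweep; this is an assumption-level simplification, not kubelet's cpu_manager source:

```go
// Toy stale-state sweep: drop assignments keyed by (podUID, container)
// whose pod UID is no longer live, printing lines shaped like the log's.
package main

import "fmt"

type key struct{ podUID, container string }

func main() {
	assignments := map[key]string{
		{"7050eaa4-061a-4dd3-b4da-73e2abd04458", "ceilometer-central-agent"}: "cpuset",
		{"9397e9d7-d42d-405b-872c-df7b05e96870", "dnsmasq-dns"}:              "cpuset",
	}
	live := map[string]bool{"e259c239-66b0-409a-8819-91430929950a": true}
	for k := range assignments { // deleting during range is safe in Go
		if !live[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
				k.podUID, k.container)
			delete(assignments, k)
		}
	}
}
```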
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.339058 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.339975 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.340135 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.405390 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvcwk\" (UniqueName: \"kubernetes.io/projected/e259c239-66b0-409a-8819-91430929950a-kube-api-access-cvcwk\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.405468 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-config-data\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.405508 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.405585 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-scripts\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.405622 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e259c239-66b0-409a-8819-91430929950a-run-httpd\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.405643 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e259c239-66b0-409a-8819-91430929950a-log-httpd\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.405700 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.507419 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-config-data\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.507511 
4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.507602 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-scripts\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.507643 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e259c239-66b0-409a-8819-91430929950a-run-httpd\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.507667 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e259c239-66b0-409a-8819-91430929950a-log-httpd\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.507690 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.507717 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvcwk\" (UniqueName: \"kubernetes.io/projected/e259c239-66b0-409a-8819-91430929950a-kube-api-access-cvcwk\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.508492 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e259c239-66b0-409a-8819-91430929950a-run-httpd\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.508952 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e259c239-66b0-409a-8819-91430929950a-log-httpd\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.513194 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.517274 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.517855 4664 operation_generator.go:637] 
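[Annotation] The mount sequence above implies the new ceilometer-0 pod's volume set: four secret volumes, two emptyDir scratch volumes, and the auto-generated projected service-account token volume (kube-api-access-cvcwk). A sketch of that set using k8s.io/api types; the secret names for "scripts" and "config-data" come from the reflector entries above, while the other two secret names are assumptions:

```go
// Requires the k8s.io/api module. Volume names and plugin kinds are read
// off the log; only the marked SecretName values are assumed.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

var ceilometerVolumes = []corev1.Volume{
	{Name: "scripts", VolumeSource: corev1.VolumeSource{
		Secret: &corev1.SecretVolumeSource{SecretName: "ceilometer-scripts"}}},
	{Name: "config-data", VolumeSource: corev1.VolumeSource{
		Secret: &corev1.SecretVolumeSource{SecretName: "ceilometer-config-data"}}},
	{Name: "combined-ca-bundle", VolumeSource: corev1.VolumeSource{
		Secret: &corev1.SecretVolumeSource{SecretName: "combined-ca-bundle"}}}, // assumed name
	{Name: "sg-core-conf-yaml", VolumeSource: corev1.VolumeSource{
		Secret: &corev1.SecretVolumeSource{SecretName: "sg-core-conf-yaml"}}}, // assumed name
	{Name: "run-httpd", VolumeSource: corev1.VolumeSource{
		EmptyDir: &corev1.EmptyDirVolumeSource{}}},
	{Name: "log-httpd", VolumeSource: corev1.VolumeSource{
		EmptyDir: &corev1.EmptyDirVolumeSource{}}},
	// kube-api-access-cvcwk is the injected projected token volume; the
	// kubelet mounts it like any other projected volume.
}

func main() { fmt.Println(len(ceilometerVolumes), "volumes") }
```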
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-config-data\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.520181 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-scripts\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.546319 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvcwk\" (UniqueName: \"kubernetes.io/projected/e259c239-66b0-409a-8819-91430929950a-kube-api-access-cvcwk\") pod \"ceilometer-0\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.654407 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.876300 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-85fcf9fb6-r8r76" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.929047 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7050eaa4-061a-4dd3-b4da-73e2abd04458" path="/var/lib/kubelet/pods/7050eaa4-061a-4dd3-b4da-73e2abd04458/volumes" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.965962 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76644f9584-br5jb"] Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.972904 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76644f9584-br5jb" podUID="30ce3373-ef30-4727-b57f-5be7963d1892" containerName="horizon" containerID="cri-o://78ce76bbf2905152072cab64efc16b2aac3c660597a68006c974ce630c008830" gracePeriod=30 Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.973729 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76644f9584-br5jb" podUID="30ce3373-ef30-4727-b57f-5be7963d1892" containerName="horizon-log" containerID="cri-o://b67e64d8a85e712979c8209e18f5938ff206128e2c3d1b5097e5fa9956960c1c" gracePeriod=30 Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.984496 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-76644f9584-br5jb" podUID="30ce3373-ef30-4727-b57f-5be7963d1892" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Oct 03 08:09:37 crc kubenswrapper[4664]: I1003 08:09:37.994693 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-76644f9584-br5jb" podUID="30ce3373-ef30-4727-b57f-5be7963d1892" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Oct 03 08:09:38 crc kubenswrapper[4664]: I1003 08:09:38.157490 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:09:38 crc kubenswrapper[4664]: I1003 08:09:38.197247 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e259c239-66b0-409a-8819-91430929950a","Type":"ContainerStarted","Data":"b007fc15da1f7652c2e2e7bb3d38853648fd2c936f8817ef0530ca1fa03a75ac"} Oct 03 08:09:40 crc 
Oct 03 08:09:40 crc kubenswrapper[4664]: I1003 08:09:40.218440 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e259c239-66b0-409a-8819-91430929950a","Type":"ContainerStarted","Data":"9ae2ef0d9c43e7e6df456cbd87079e52b75a8230fbc13b17fa002c29763e687b"}
Oct 03 08:09:40 crc kubenswrapper[4664]: I1003 08:09:40.798936 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58f858bbd-kn4ws"
Oct 03 08:09:41 crc kubenswrapper[4664]: I1003 08:09:41.230094 4664 generic.go:334] "Generic (PLEG): container finished" podID="4d316d5e-f411-4940-af4d-9c42f5baae63" containerID="18a7ff075956ded64f9d6e2abb1947765764e84363b9a35a07db64b75756a64d" exitCode=0
Oct 03 08:09:41 crc kubenswrapper[4664]: I1003 08:09:41.230308 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lgr4h" event={"ID":"4d316d5e-f411-4940-af4d-9c42f5baae63","Type":"ContainerDied","Data":"18a7ff075956ded64f9d6e2abb1947765764e84363b9a35a07db64b75756a64d"}
Oct 03 08:09:41 crc kubenswrapper[4664]: I1003 08:09:41.233880 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e259c239-66b0-409a-8819-91430929950a","Type":"ContainerStarted","Data":"288a4092fd889c4bfc68e6659997e8fe06bbf3f67cd75e26f921c8e8b07f6632"}
Oct 03 08:09:41 crc kubenswrapper[4664]: I1003 08:09:41.987365 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 08:09:41 crc kubenswrapper[4664]: I1003 08:09:41.987701 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 08:09:41 crc kubenswrapper[4664]: I1003 08:09:41.987751 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm"
Oct 03 08:09:41 crc kubenswrapper[4664]: I1003 08:09:41.988434 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"06473cda750028c12efef390356377e8ae805e2359da1c4b578e9e258218058e"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 08:09:41 crc kubenswrapper[4664]: I1003 08:09:41.988491 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://06473cda750028c12efef390356377e8ae805e2359da1c4b578e9e258218058e" gracePeriod=600
Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.246069 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e259c239-66b0-409a-8819-91430929950a","Type":"ContainerStarted","Data":"90abca23b94a01f13efa40f02002f7a387d97640b51b60640219ec78d31b2367"}
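[Annotation] The machine-config-daemon sequence above is the full liveness-failure path: an HTTP GET probe is refused, the kubelet marks the container unhealthy, kills it with the pod's grace period (600 s here), and a new container is started shortly after. A minimal Go sketch of what such an HTTP probe checks; kubelet treats a connection error or a status outside 2xx/3xx as failure:

```go
// One HTTP liveness check, shaped like the probe that failed above.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func probeOnce(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "dial tcp 127.0.0.1:8798: connect: connection refused"
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probeOnce("http://127.0.0.1:8798/health"); err != nil {
		fmt.Println("Probe failed:", err) // the kubelet then restarts the container
	}
}
```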
podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="06473cda750028c12efef390356377e8ae805e2359da1c4b578e9e258218058e" exitCode=0 Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.248522 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"06473cda750028c12efef390356377e8ae805e2359da1c4b578e9e258218058e"} Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.248561 4664 scope.go:117] "RemoveContainer" containerID="d72d28e356e7ba889e503f6d77ef4dcc3b64c797b9e1df46488fe0f1d0abb973" Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.484421 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.673594 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.742386 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d316d5e-f411-4940-af4d-9c42f5baae63-etc-machine-id\") pod \"4d316d5e-f411-4940-af4d-9c42f5baae63\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.742437 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-combined-ca-bundle\") pod \"4d316d5e-f411-4940-af4d-9c42f5baae63\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.742535 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-db-sync-config-data\") pod \"4d316d5e-f411-4940-af4d-9c42f5baae63\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.742632 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4d316d5e-f411-4940-af4d-9c42f5baae63-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4d316d5e-f411-4940-af4d-9c42f5baae63" (UID: "4d316d5e-f411-4940-af4d-9c42f5baae63"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.742659 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-scripts\") pod \"4d316d5e-f411-4940-af4d-9c42f5baae63\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.742838 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-config-data\") pod \"4d316d5e-f411-4940-af4d-9c42f5baae63\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.743291 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhjb2\" (UniqueName: \"kubernetes.io/projected/4d316d5e-f411-4940-af4d-9c42f5baae63-kube-api-access-qhjb2\") pod \"4d316d5e-f411-4940-af4d-9c42f5baae63\" (UID: \"4d316d5e-f411-4940-af4d-9c42f5baae63\") " Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.744352 4664 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d316d5e-f411-4940-af4d-9c42f5baae63-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.757641 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d316d5e-f411-4940-af4d-9c42f5baae63-kube-api-access-qhjb2" (OuterVolumeSpecName: "kube-api-access-qhjb2") pod "4d316d5e-f411-4940-af4d-9c42f5baae63" (UID: "4d316d5e-f411-4940-af4d-9c42f5baae63"). InnerVolumeSpecName "kube-api-access-qhjb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.757908 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4d316d5e-f411-4940-af4d-9c42f5baae63" (UID: "4d316d5e-f411-4940-af4d-9c42f5baae63"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.764771 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-scripts" (OuterVolumeSpecName: "scripts") pod "4d316d5e-f411-4940-af4d-9c42f5baae63" (UID: "4d316d5e-f411-4940-af4d-9c42f5baae63"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.782334 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d316d5e-f411-4940-af4d-9c42f5baae63" (UID: "4d316d5e-f411-4940-af4d-9c42f5baae63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.816841 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-config-data" (OuterVolumeSpecName: "config-data") pod "4d316d5e-f411-4940-af4d-9c42f5baae63" (UID: "4d316d5e-f411-4940-af4d-9c42f5baae63"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.846105 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.846143 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhjb2\" (UniqueName: \"kubernetes.io/projected/4d316d5e-f411-4940-af4d-9c42f5baae63-kube-api-access-qhjb2\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.846159 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.846170 4664 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:42 crc kubenswrapper[4664]: I1003 08:09:42.846180 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d316d5e-f411-4940-af4d-9c42f5baae63-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.269082 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lgr4h" event={"ID":"4d316d5e-f411-4940-af4d-9c42f5baae63","Type":"ContainerDied","Data":"f52b6d1984a3852d12d9bb25723573b748d50ed3012cbefe0f42203494fb5cc7"} Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.269905 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f52b6d1984a3852d12d9bb25723573b748d50ed3012cbefe0f42203494fb5cc7" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.269112 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lgr4h" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.273406 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"2dfe6ff457d0c2bccf5db2631d7781386b8da1168146e54f8a4ae9ce420f6b83"} Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.590131 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:09:43 crc kubenswrapper[4664]: E1003 08:09:43.601134 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d316d5e-f411-4940-af4d-9c42f5baae63" containerName="cinder-db-sync" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.601210 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d316d5e-f411-4940-af4d-9c42f5baae63" containerName="cinder-db-sync" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.601653 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d316d5e-f411-4940-af4d-9c42f5baae63" containerName="cinder-db-sync" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.604808 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.611760 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fbgj8" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.615592 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.616740 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.622461 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.623041 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.651767 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.661511 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-config-data\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.661599 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-scripts\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.661659 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.661696 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.661728 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp6mk\" (UniqueName: \"kubernetes.io/projected/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-kube-api-access-lp6mk\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.661773 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.731807 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-vz86c"] 
Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.772280 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-config-data\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0"
Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.772360 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-scripts\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0"
Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.772406 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0"
Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.772424 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0"
Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.772466 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp6mk\" (UniqueName: \"kubernetes.io/projected/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-kube-api-access-lp6mk\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0"
Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.772522 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0"
Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.783723 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0"
Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.791688 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0"
Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.792781 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-config-data\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0"
Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.798050 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-scripts\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0"
(UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.798981 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.815222 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp6mk\" (UniqueName: \"kubernetes.io/projected/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-kube-api-access-lp6mk\") pod \"cinder-scheduler-0\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.820899 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-x99b6"] Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.832307 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.873930 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5x7s\" (UniqueName: \"kubernetes.io/projected/40e2fe2c-8b4e-4b78-899e-1daed68626da-kube-api-access-b5x7s\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.874032 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.874130 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-config\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.874222 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.874258 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.874337 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: 
\"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.928188 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-x99b6"] Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.941951 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.943640 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.957086 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.967795 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.975807 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.977810 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.977901 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.977956 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.978029 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.978120 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5x7s\" (UniqueName: \"kubernetes.io/projected/40e2fe2c-8b4e-4b78-899e-1daed68626da-kube-api-access-b5x7s\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.978180 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.978208 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-config-data-custom\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.978237 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-scripts\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.978295 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/427e9a44-7186-4c44-b528-3b24993c1e31-logs\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.978326 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-config-data\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.978363 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/427e9a44-7186-4c44-b528-3b24993c1e31-etc-machine-id\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.978423 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-config\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.978468 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tkd4\" (UniqueName: \"kubernetes.io/projected/427e9a44-7186-4c44-b528-3b24993c1e31-kube-api-access-4tkd4\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.981337 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.981488 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.982462 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-config\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" 
Oct 03 08:09:43 crc kubenswrapper[4664]: I1003 08:09:43.991781 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6"
Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:43.998456 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6"
Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.037335 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5x7s\" (UniqueName: \"kubernetes.io/projected/40e2fe2c-8b4e-4b78-899e-1daed68626da-kube-api-access-b5x7s\") pod \"dnsmasq-dns-6bb4fc677f-x99b6\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6"
Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.080673 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tkd4\" (UniqueName: \"kubernetes.io/projected/427e9a44-7186-4c44-b528-3b24993c1e31-kube-api-access-4tkd4\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0"
Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.080763 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0"
Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.080836 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-config-data-custom\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0"
Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.080856 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-scripts\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0"
Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.080888 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/427e9a44-7186-4c44-b528-3b24993c1e31-logs\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0"
Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.080910 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-config-data\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0"
Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.080932 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/427e9a44-7186-4c44-b528-3b24993c1e31-etc-machine-id\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0"
\"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0" Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.081040 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/427e9a44-7186-4c44-b528-3b24993c1e31-etc-machine-id\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0" Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.086770 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/427e9a44-7186-4c44-b528-3b24993c1e31-logs\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0" Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.100788 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-config-data\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0" Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.104815 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-scripts\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0" Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.105186 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-config-data-custom\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0" Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.106401 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0" Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.110172 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tkd4\" (UniqueName: \"kubernetes.io/projected/427e9a44-7186-4c44-b528-3b24993c1e31-kube-api-access-4tkd4\") pod \"cinder-api-0\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " pod="openstack/cinder-api-0" Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.239769 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.300844 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.391685 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e259c239-66b0-409a-8819-91430929950a","Type":"ContainerStarted","Data":"19b56281c965636b4ff097ca0f6d749632f22c733e4c76eaf2d6d5bda8d87c50"} Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.393145 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-vz86c" podUID="618be4c7-ebc2-43f5-aed3-f9e0c83fad8d" containerName="dnsmasq-dns" containerID="cri-o://e15838efc9037f1d9632b7ff99ce466257dee4188a30f28bdf538bb19c926da1" gracePeriod=10 Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.393474 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.436871 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-76644f9584-br5jb" podUID="30ce3373-ef30-4727-b57f-5be7963d1892" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:38656->10.217.0.142:8443: read: connection reset by peer" Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.447507 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.268309143 podStartE2EDuration="7.447481469s" podCreationTimestamp="2025-10-03 08:09:37 +0000 UTC" firstStartedPulling="2025-10-03 08:09:38.169176734 +0000 UTC m=+1278.990367224" lastFinishedPulling="2025-10-03 08:09:43.34834906 +0000 UTC m=+1284.169539550" observedRunningTime="2025-10-03 08:09:44.428528798 +0000 UTC m=+1285.249719298" watchObservedRunningTime="2025-10-03 08:09:44.447481469 +0000 UTC m=+1285.268671969" Oct 03 08:09:44 crc kubenswrapper[4664]: I1003 08:09:44.679351 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.018150 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-x99b6"] Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.128958 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.251544 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-ovsdbserver-nb\") pod \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.251663 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-ovsdbserver-sb\") pod \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.251830 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xws2\" (UniqueName: \"kubernetes.io/projected/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-kube-api-access-5xws2\") pod \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.251889 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-config\") pod \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.252050 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-dns-swift-storage-0\") pod \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.252118 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-dns-svc\") pod \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\" (UID: \"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d\") " Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.275066 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-kube-api-access-5xws2" (OuterVolumeSpecName: "kube-api-access-5xws2") pod "618be4c7-ebc2-43f5-aed3-f9e0c83fad8d" (UID: "618be4c7-ebc2-43f5-aed3-f9e0c83fad8d"). InnerVolumeSpecName "kube-api-access-5xws2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.292980 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.355151 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xws2\" (UniqueName: \"kubernetes.io/projected/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-kube-api-access-5xws2\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.408953 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"427e9a44-7186-4c44-b528-3b24993c1e31","Type":"ContainerStarted","Data":"b238fa2c9774ac9571ec2d17b3ae79db87f421a4aff20e85592a6f7329b3caa8"} Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.421546 4664 generic.go:334] "Generic (PLEG): container finished" podID="618be4c7-ebc2-43f5-aed3-f9e0c83fad8d" containerID="e15838efc9037f1d9632b7ff99ce466257dee4188a30f28bdf538bb19c926da1" exitCode=0 Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.421648 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-vz86c" event={"ID":"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d","Type":"ContainerDied","Data":"e15838efc9037f1d9632b7ff99ce466257dee4188a30f28bdf538bb19c926da1"} Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.421682 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-vz86c" event={"ID":"618be4c7-ebc2-43f5-aed3-f9e0c83fad8d","Type":"ContainerDied","Data":"62860479bda86ae6d37513b0f72b21c7b8d57d59bfe051716b193ff982b626d7"} Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.421706 4664 scope.go:117] "RemoveContainer" containerID="e15838efc9037f1d9632b7ff99ce466257dee4188a30f28bdf538bb19c926da1" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.421840 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-vz86c" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.432086 4664 generic.go:334] "Generic (PLEG): container finished" podID="30ce3373-ef30-4727-b57f-5be7963d1892" containerID="78ce76bbf2905152072cab64efc16b2aac3c660597a68006c974ce630c008830" exitCode=0 Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.432512 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76644f9584-br5jb" event={"ID":"30ce3373-ef30-4727-b57f-5be7963d1892","Type":"ContainerDied","Data":"78ce76bbf2905152072cab64efc16b2aac3c660597a68006c974ce630c008830"} Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.442881 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f","Type":"ContainerStarted","Data":"825f2bb8ab089a84d717509952f98b1a61cb881f2429204cf89ed329323c6eff"} Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.453954 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" event={"ID":"40e2fe2c-8b4e-4b78-899e-1daed68626da","Type":"ContainerStarted","Data":"9aca1718d202ab8145850318a6c0f977caccab430da3a348cc22376004b729f0"} Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.461405 4664 scope.go:117] "RemoveContainer" containerID="40a58d8ae436181512650f3adae277582da01b374f17cf08d3006ea4eb7a6b3b" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.497077 4664 scope.go:117] "RemoveContainer" containerID="e15838efc9037f1d9632b7ff99ce466257dee4188a30f28bdf538bb19c926da1" Oct 03 08:09:45 crc kubenswrapper[4664]: E1003 08:09:45.497978 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e15838efc9037f1d9632b7ff99ce466257dee4188a30f28bdf538bb19c926da1\": container with ID starting with e15838efc9037f1d9632b7ff99ce466257dee4188a30f28bdf538bb19c926da1 not found: ID does not exist" containerID="e15838efc9037f1d9632b7ff99ce466257dee4188a30f28bdf538bb19c926da1" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.498097 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e15838efc9037f1d9632b7ff99ce466257dee4188a30f28bdf538bb19c926da1"} err="failed to get container status \"e15838efc9037f1d9632b7ff99ce466257dee4188a30f28bdf538bb19c926da1\": rpc error: code = NotFound desc = could not find container \"e15838efc9037f1d9632b7ff99ce466257dee4188a30f28bdf538bb19c926da1\": container with ID starting with e15838efc9037f1d9632b7ff99ce466257dee4188a30f28bdf538bb19c926da1 not found: ID does not exist" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.498134 4664 scope.go:117] "RemoveContainer" containerID="40a58d8ae436181512650f3adae277582da01b374f17cf08d3006ea4eb7a6b3b" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.498495 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-config" (OuterVolumeSpecName: "config") pod "618be4c7-ebc2-43f5-aed3-f9e0c83fad8d" (UID: "618be4c7-ebc2-43f5-aed3-f9e0c83fad8d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:45 crc kubenswrapper[4664]: E1003 08:09:45.498697 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40a58d8ae436181512650f3adae277582da01b374f17cf08d3006ea4eb7a6b3b\": container with ID starting with 40a58d8ae436181512650f3adae277582da01b374f17cf08d3006ea4eb7a6b3b not found: ID does not exist" containerID="40a58d8ae436181512650f3adae277582da01b374f17cf08d3006ea4eb7a6b3b" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.498733 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40a58d8ae436181512650f3adae277582da01b374f17cf08d3006ea4eb7a6b3b"} err="failed to get container status \"40a58d8ae436181512650f3adae277582da01b374f17cf08d3006ea4eb7a6b3b\": rpc error: code = NotFound desc = could not find container \"40a58d8ae436181512650f3adae277582da01b374f17cf08d3006ea4eb7a6b3b\": container with ID starting with 40a58d8ae436181512650f3adae277582da01b374f17cf08d3006ea4eb7a6b3b not found: ID does not exist" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.529687 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "618be4c7-ebc2-43f5-aed3-f9e0c83fad8d" (UID: "618be4c7-ebc2-43f5-aed3-f9e0c83fad8d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.549367 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "618be4c7-ebc2-43f5-aed3-f9e0c83fad8d" (UID: "618be4c7-ebc2-43f5-aed3-f9e0c83fad8d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.564316 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.564361 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.564372 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.570973 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "618be4c7-ebc2-43f5-aed3-f9e0c83fad8d" (UID: "618be4c7-ebc2-43f5-aed3-f9e0c83fad8d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.599205 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "618be4c7-ebc2-43f5-aed3-f9e0c83fad8d" (UID: "618be4c7-ebc2-43f5-aed3-f9e0c83fad8d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.666450 4664 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.666492 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.676938 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.836515 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-vz86c"] Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.842891 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-vz86c"] Oct 03 08:09:45 crc kubenswrapper[4664]: I1003 08:09:45.892150 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="618be4c7-ebc2-43f5-aed3-f9e0c83fad8d" path="/var/lib/kubelet/pods/618be4c7-ebc2-43f5-aed3-f9e0c83fad8d/volumes" Oct 03 08:09:46 crc kubenswrapper[4664]: I1003 08:09:46.404809 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:09:46 crc kubenswrapper[4664]: I1003 08:09:46.468052 4664 generic.go:334] "Generic (PLEG): container finished" podID="40e2fe2c-8b4e-4b78-899e-1daed68626da" containerID="76dd6b14ebccb0558775bfe7eb0e9b14dca7ad99855d3395322a024bf61a3cbd" exitCode=0 Oct 03 08:09:46 crc kubenswrapper[4664]: I1003 08:09:46.468135 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" event={"ID":"40e2fe2c-8b4e-4b78-899e-1daed68626da","Type":"ContainerDied","Data":"76dd6b14ebccb0558775bfe7eb0e9b14dca7ad99855d3395322a024bf61a3cbd"} Oct 03 08:09:47 crc kubenswrapper[4664]: I1003 08:09:47.495101 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f","Type":"ContainerStarted","Data":"da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45"} Oct 03 08:09:47 crc kubenswrapper[4664]: I1003 08:09:47.500781 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-797fcd6b7d-kpnbp" Oct 03 08:09:47 crc kubenswrapper[4664]: I1003 08:09:47.512376 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" event={"ID":"40e2fe2c-8b4e-4b78-899e-1daed68626da","Type":"ContainerStarted","Data":"7b78d6b5329290ad7260c2fb3e58ca6d0ae56f128ad2207d8d3dae7e1d9b9f7a"} Oct 03 08:09:47 crc kubenswrapper[4664]: I1003 08:09:47.512654 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:47 crc kubenswrapper[4664]: I1003 
08:09:47.617010 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" podStartSLOduration=4.616986707 podStartE2EDuration="4.616986707s" podCreationTimestamp="2025-10-03 08:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:47.58309539 +0000 UTC m=+1288.404285880" watchObservedRunningTime="2025-10-03 08:09:47.616986707 +0000 UTC m=+1288.438177197" Oct 03 08:09:47 crc kubenswrapper[4664]: I1003 08:09:47.624277 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"427e9a44-7186-4c44-b528-3b24993c1e31","Type":"ContainerStarted","Data":"8478f3ff9b1c2381d4f766ec85ab175a41e2ff8b8df90acd31febae1806c0414"} Oct 03 08:09:47 crc kubenswrapper[4664]: I1003 08:09:47.649202 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-58f858bbd-kn4ws"] Oct 03 08:09:47 crc kubenswrapper[4664]: I1003 08:09:47.649483 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-58f858bbd-kn4ws" podUID="007d3aea-e570-499e-8373-3e43e65865c1" containerName="barbican-api-log" containerID="cri-o://5d221f7f5f88e0b33e037e3d454abee6ee4db6e00718bc2a5fa26031433469c6" gracePeriod=30 Oct 03 08:09:47 crc kubenswrapper[4664]: I1003 08:09:47.654687 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-58f858bbd-kn4ws" podUID="007d3aea-e570-499e-8373-3e43e65865c1" containerName="barbican-api" containerID="cri-o://46fcd88dee1d3df9568ee3c1a5341aaf3ab6ccc47d4821f99f8d52cf576fa435" gracePeriod=30 Oct 03 08:09:47 crc kubenswrapper[4664]: I1003 08:09:47.670026 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-58f858bbd-kn4ws" podUID="007d3aea-e570-499e-8373-3e43e65865c1" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": EOF" Oct 03 08:09:47 crc kubenswrapper[4664]: I1003 08:09:47.704698 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-58f858bbd-kn4ws" podUID="007d3aea-e570-499e-8373-3e43e65865c1" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": EOF" Oct 03 08:09:48 crc kubenswrapper[4664]: I1003 08:09:48.636358 4664 generic.go:334] "Generic (PLEG): container finished" podID="007d3aea-e570-499e-8373-3e43e65865c1" containerID="5d221f7f5f88e0b33e037e3d454abee6ee4db6e00718bc2a5fa26031433469c6" exitCode=143 Oct 03 08:09:48 crc kubenswrapper[4664]: I1003 08:09:48.636443 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58f858bbd-kn4ws" event={"ID":"007d3aea-e570-499e-8373-3e43e65865c1","Type":"ContainerDied","Data":"5d221f7f5f88e0b33e037e3d454abee6ee4db6e00718bc2a5fa26031433469c6"} Oct 03 08:09:48 crc kubenswrapper[4664]: I1003 08:09:48.639566 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f","Type":"ContainerStarted","Data":"1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064"} Oct 03 08:09:48 crc kubenswrapper[4664]: I1003 08:09:48.642621 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="427e9a44-7186-4c44-b528-3b24993c1e31" containerName="cinder-api-log" 
containerID="cri-o://8478f3ff9b1c2381d4f766ec85ab175a41e2ff8b8df90acd31febae1806c0414" gracePeriod=30 Oct 03 08:09:48 crc kubenswrapper[4664]: I1003 08:09:48.642936 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"427e9a44-7186-4c44-b528-3b24993c1e31","Type":"ContainerStarted","Data":"b9319a02b1e121c25a82a8f929971279e0d74dfea2911556d16b3d210f2168d1"} Oct 03 08:09:48 crc kubenswrapper[4664]: I1003 08:09:48.642983 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 08:09:48 crc kubenswrapper[4664]: I1003 08:09:48.643017 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="427e9a44-7186-4c44-b528-3b24993c1e31" containerName="cinder-api" containerID="cri-o://b9319a02b1e121c25a82a8f929971279e0d74dfea2911556d16b3d210f2168d1" gracePeriod=30 Oct 03 08:09:48 crc kubenswrapper[4664]: I1003 08:09:48.664263 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.468419968 podStartE2EDuration="5.664245086s" podCreationTimestamp="2025-10-03 08:09:43 +0000 UTC" firstStartedPulling="2025-10-03 08:09:44.750917996 +0000 UTC m=+1285.572108486" lastFinishedPulling="2025-10-03 08:09:45.946743114 +0000 UTC m=+1286.767933604" observedRunningTime="2025-10-03 08:09:48.661068045 +0000 UTC m=+1289.482258545" watchObservedRunningTime="2025-10-03 08:09:48.664245086 +0000 UTC m=+1289.485435576" Oct 03 08:09:48 crc kubenswrapper[4664]: I1003 08:09:48.693303 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.693287474 podStartE2EDuration="5.693287474s" podCreationTimestamp="2025-10-03 08:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:48.692055459 +0000 UTC m=+1289.513245969" watchObservedRunningTime="2025-10-03 08:09:48.693287474 +0000 UTC m=+1289.514477964" Oct 03 08:09:48 crc kubenswrapper[4664]: I1003 08:09:48.931129 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:48 crc kubenswrapper[4664]: I1003 08:09:48.968985 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 08:09:48 crc kubenswrapper[4664]: I1003 08:09:48.974556 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67cd87976d-7fbgw" Oct 03 08:09:49 crc kubenswrapper[4664]: I1003 08:09:49.685867 4664 generic.go:334] "Generic (PLEG): container finished" podID="427e9a44-7186-4c44-b528-3b24993c1e31" containerID="b9319a02b1e121c25a82a8f929971279e0d74dfea2911556d16b3d210f2168d1" exitCode=0 Oct 03 08:09:49 crc kubenswrapper[4664]: I1003 08:09:49.686168 4664 generic.go:334] "Generic (PLEG): container finished" podID="427e9a44-7186-4c44-b528-3b24993c1e31" containerID="8478f3ff9b1c2381d4f766ec85ab175a41e2ff8b8df90acd31febae1806c0414" exitCode=143 Oct 03 08:09:49 crc kubenswrapper[4664]: I1003 08:09:49.686240 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"427e9a44-7186-4c44-b528-3b24993c1e31","Type":"ContainerDied","Data":"b9319a02b1e121c25a82a8f929971279e0d74dfea2911556d16b3d210f2168d1"} Oct 03 08:09:49 crc kubenswrapper[4664]: I1003 08:09:49.686288 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"427e9a44-7186-4c44-b528-3b24993c1e31","Type":"ContainerDied","Data":"8478f3ff9b1c2381d4f766ec85ab175a41e2ff8b8df90acd31febae1806c0414"} Oct 03 08:09:49 crc kubenswrapper[4664]: I1003 08:09:49.888058 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:09:49 crc kubenswrapper[4664]: I1003 08:09:49.960679 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-ccfbd46bc-qz9qm" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.014922 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/427e9a44-7186-4c44-b528-3b24993c1e31-logs\") pod \"427e9a44-7186-4c44-b528-3b24993c1e31\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.015205 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tkd4\" (UniqueName: \"kubernetes.io/projected/427e9a44-7186-4c44-b528-3b24993c1e31-kube-api-access-4tkd4\") pod \"427e9a44-7186-4c44-b528-3b24993c1e31\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.015820 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-config-data-custom\") pod \"427e9a44-7186-4c44-b528-3b24993c1e31\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.015873 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-config-data\") pod \"427e9a44-7186-4c44-b528-3b24993c1e31\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.015895 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-scripts\") pod \"427e9a44-7186-4c44-b528-3b24993c1e31\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.015922 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/427e9a44-7186-4c44-b528-3b24993c1e31-etc-machine-id\") pod \"427e9a44-7186-4c44-b528-3b24993c1e31\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.015946 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-combined-ca-bundle\") pod \"427e9a44-7186-4c44-b528-3b24993c1e31\" (UID: \"427e9a44-7186-4c44-b528-3b24993c1e31\") " Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.018160 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/427e9a44-7186-4c44-b528-3b24993c1e31-logs" (OuterVolumeSpecName: "logs") pod "427e9a44-7186-4c44-b528-3b24993c1e31" (UID: "427e9a44-7186-4c44-b528-3b24993c1e31"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.019312 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/427e9a44-7186-4c44-b528-3b24993c1e31-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "427e9a44-7186-4c44-b528-3b24993c1e31" (UID: "427e9a44-7186-4c44-b528-3b24993c1e31"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.032423 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "427e9a44-7186-4c44-b528-3b24993c1e31" (UID: "427e9a44-7186-4c44-b528-3b24993c1e31"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.032885 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/427e9a44-7186-4c44-b528-3b24993c1e31-kube-api-access-4tkd4" (OuterVolumeSpecName: "kube-api-access-4tkd4") pod "427e9a44-7186-4c44-b528-3b24993c1e31" (UID: "427e9a44-7186-4c44-b528-3b24993c1e31"). InnerVolumeSpecName "kube-api-access-4tkd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.077856 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-scripts" (OuterVolumeSpecName: "scripts") pod "427e9a44-7186-4c44-b528-3b24993c1e31" (UID: "427e9a44-7186-4c44-b528-3b24993c1e31"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.115993 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "427e9a44-7186-4c44-b528-3b24993c1e31" (UID: "427e9a44-7186-4c44-b528-3b24993c1e31"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.117833 4664 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.117861 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.117872 4664 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/427e9a44-7186-4c44-b528-3b24993c1e31-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.117881 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.117891 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/427e9a44-7186-4c44-b528-3b24993c1e31-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.117901 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tkd4\" (UniqueName: \"kubernetes.io/projected/427e9a44-7186-4c44-b528-3b24993c1e31-kube-api-access-4tkd4\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.173718 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-config-data" (OuterVolumeSpecName: "config-data") pod "427e9a44-7186-4c44-b528-3b24993c1e31" (UID: "427e9a44-7186-4c44-b528-3b24993c1e31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.219711 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/427e9a44-7186-4c44-b528-3b24993c1e31-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.704220 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.704279 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"427e9a44-7186-4c44-b528-3b24993c1e31","Type":"ContainerDied","Data":"b238fa2c9774ac9571ec2d17b3ae79db87f421a4aff20e85592a6f7329b3caa8"} Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.704323 4664 scope.go:117] "RemoveContainer" containerID="b9319a02b1e121c25a82a8f929971279e0d74dfea2911556d16b3d210f2168d1" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.742035 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.742643 4664 scope.go:117] "RemoveContainer" containerID="8478f3ff9b1c2381d4f766ec85ab175a41e2ff8b8df90acd31febae1806c0414" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.763735 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.767415 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:09:50 crc kubenswrapper[4664]: E1003 08:09:50.767858 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618be4c7-ebc2-43f5-aed3-f9e0c83fad8d" containerName="dnsmasq-dns" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.767875 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="618be4c7-ebc2-43f5-aed3-f9e0c83fad8d" containerName="dnsmasq-dns" Oct 03 08:09:50 crc kubenswrapper[4664]: E1003 08:09:50.767890 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427e9a44-7186-4c44-b528-3b24993c1e31" containerName="cinder-api-log" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.767897 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="427e9a44-7186-4c44-b528-3b24993c1e31" containerName="cinder-api-log" Oct 03 08:09:50 crc kubenswrapper[4664]: E1003 08:09:50.767932 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427e9a44-7186-4c44-b528-3b24993c1e31" containerName="cinder-api" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.767939 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="427e9a44-7186-4c44-b528-3b24993c1e31" containerName="cinder-api" Oct 03 08:09:50 crc kubenswrapper[4664]: E1003 08:09:50.767946 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618be4c7-ebc2-43f5-aed3-f9e0c83fad8d" containerName="init" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.767952 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="618be4c7-ebc2-43f5-aed3-f9e0c83fad8d" containerName="init" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.768116 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="427e9a44-7186-4c44-b528-3b24993c1e31" containerName="cinder-api-log" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.768139 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="427e9a44-7186-4c44-b528-3b24993c1e31" containerName="cinder-api" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.768154 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="618be4c7-ebc2-43f5-aed3-f9e0c83fad8d" containerName="dnsmasq-dns" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.769122 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.774039 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.774308 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.774906 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.796599 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.940097 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81114761-59aa-4b5d-8848-963f6c73efe2-logs\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.940930 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81114761-59aa-4b5d-8848-963f6c73efe2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.941225 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.941405 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-scripts\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.941546 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6b7t\" (UniqueName: \"kubernetes.io/projected/81114761-59aa-4b5d-8848-963f6c73efe2-kube-api-access-h6b7t\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.941631 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.941658 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.941792 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-config-data-custom\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:50 crc kubenswrapper[4664]: I1003 08:09:50.941866 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-config-data\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.043030 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-config-data-custom\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.043120 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-config-data\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.044193 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81114761-59aa-4b5d-8848-963f6c73efe2-logs\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.044239 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81114761-59aa-4b5d-8848-963f6c73efe2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.044290 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.044384 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81114761-59aa-4b5d-8848-963f6c73efe2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.044473 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-scripts\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.044556 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6b7t\" (UniqueName: \"kubernetes.io/projected/81114761-59aa-4b5d-8848-963f6c73efe2-kube-api-access-h6b7t\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.044709 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/81114761-59aa-4b5d-8848-963f6c73efe2-logs\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.044974 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.045009 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.048427 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.048944 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-config-data\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.049375 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-config-data-custom\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.049543 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.051212 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-scripts\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.053743 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81114761-59aa-4b5d-8848-963f6c73efe2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.081416 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6b7t\" (UniqueName: \"kubernetes.io/projected/81114761-59aa-4b5d-8848-963f6c73efe2-kube-api-access-h6b7t\") pod \"cinder-api-0\" (UID: \"81114761-59aa-4b5d-8848-963f6c73efe2\") " pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.178751 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.370770 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.680185 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.715455 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"81114761-59aa-4b5d-8848-963f6c73efe2","Type":"ContainerStarted","Data":"8323d2f321cb56135af8de58d112f179323e1fcbbdab73d545c6b5bcdd462d7f"} Oct 03 08:09:51 crc kubenswrapper[4664]: I1003 08:09:51.892409 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="427e9a44-7186-4c44-b528-3b24993c1e31" path="/var/lib/kubelet/pods/427e9a44-7186-4c44-b528-3b24993c1e31/volumes" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.234104 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.236173 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.240457 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.240474 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-k4fkd" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.241018 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.244339 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.268968 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/06ee9d13-7b0c-4619-8421-6f1a5d8a2f05-openstack-config\") pod \"openstackclient\" (UID: \"06ee9d13-7b0c-4619-8421-6f1a5d8a2f05\") " pod="openstack/openstackclient" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.269812 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xsk5\" (UniqueName: \"kubernetes.io/projected/06ee9d13-7b0c-4619-8421-6f1a5d8a2f05-kube-api-access-2xsk5\") pod \"openstackclient\" (UID: \"06ee9d13-7b0c-4619-8421-6f1a5d8a2f05\") " pod="openstack/openstackclient" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.270226 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/06ee9d13-7b0c-4619-8421-6f1a5d8a2f05-openstack-config-secret\") pod \"openstackclient\" (UID: \"06ee9d13-7b0c-4619-8421-6f1a5d8a2f05\") " pod="openstack/openstackclient" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.270267 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ee9d13-7b0c-4619-8421-6f1a5d8a2f05-combined-ca-bundle\") pod \"openstackclient\" (UID: \"06ee9d13-7b0c-4619-8421-6f1a5d8a2f05\") " pod="openstack/openstackclient" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.371167 
4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xsk5\" (UniqueName: \"kubernetes.io/projected/06ee9d13-7b0c-4619-8421-6f1a5d8a2f05-kube-api-access-2xsk5\") pod \"openstackclient\" (UID: \"06ee9d13-7b0c-4619-8421-6f1a5d8a2f05\") " pod="openstack/openstackclient" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.371267 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/06ee9d13-7b0c-4619-8421-6f1a5d8a2f05-openstack-config-secret\") pod \"openstackclient\" (UID: \"06ee9d13-7b0c-4619-8421-6f1a5d8a2f05\") " pod="openstack/openstackclient" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.371302 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ee9d13-7b0c-4619-8421-6f1a5d8a2f05-combined-ca-bundle\") pod \"openstackclient\" (UID: \"06ee9d13-7b0c-4619-8421-6f1a5d8a2f05\") " pod="openstack/openstackclient" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.371370 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/06ee9d13-7b0c-4619-8421-6f1a5d8a2f05-openstack-config\") pod \"openstackclient\" (UID: \"06ee9d13-7b0c-4619-8421-6f1a5d8a2f05\") " pod="openstack/openstackclient" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.372671 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/06ee9d13-7b0c-4619-8421-6f1a5d8a2f05-openstack-config\") pod \"openstackclient\" (UID: \"06ee9d13-7b0c-4619-8421-6f1a5d8a2f05\") " pod="openstack/openstackclient" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.379726 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/06ee9d13-7b0c-4619-8421-6f1a5d8a2f05-openstack-config-secret\") pod \"openstackclient\" (UID: \"06ee9d13-7b0c-4619-8421-6f1a5d8a2f05\") " pod="openstack/openstackclient" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.383008 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ee9d13-7b0c-4619-8421-6f1a5d8a2f05-combined-ca-bundle\") pod \"openstackclient\" (UID: \"06ee9d13-7b0c-4619-8421-6f1a5d8a2f05\") " pod="openstack/openstackclient" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.393629 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xsk5\" (UniqueName: \"kubernetes.io/projected/06ee9d13-7b0c-4619-8421-6f1a5d8a2f05-kube-api-access-2xsk5\") pod \"openstackclient\" (UID: \"06ee9d13-7b0c-4619-8421-6f1a5d8a2f05\") " pod="openstack/openstackclient" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.558097 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 08:09:52 crc kubenswrapper[4664]: I1003 08:09:52.735222 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"81114761-59aa-4b5d-8848-963f6c73efe2","Type":"ContainerStarted","Data":"408727f57e754e698aaf5d0f3215d90b0738968a8d366424cc1d1f81baaad9b1"} Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.045400 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 08:09:53 crc kubenswrapper[4664]: W1003 08:09:53.052016 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06ee9d13_7b0c_4619_8421_6f1a5d8a2f05.slice/crio-89c458c822f84297742fc7a0472ea4bd579442d0b249a2981f3a2764152acdd2 WatchSource:0}: Error finding container 89c458c822f84297742fc7a0472ea4bd579442d0b249a2981f3a2764152acdd2: Status 404 returned error can't find the container with id 89c458c822f84297742fc7a0472ea4bd579442d0b249a2981f3a2764152acdd2 Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.114674 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-58f858bbd-kn4ws" podUID="007d3aea-e570-499e-8373-3e43e65865c1" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:47068->10.217.0.157:9311: read: connection reset by peer" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.114919 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-58f858bbd-kn4ws" podUID="007d3aea-e570-499e-8373-3e43e65865c1" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:47082->10.217.0.157:9311: read: connection reset by peer" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.582849 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-76644f9584-br5jb" podUID="30ce3373-ef30-4727-b57f-5be7963d1892" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.652772 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.749579 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"06ee9d13-7b0c-4619-8421-6f1a5d8a2f05","Type":"ContainerStarted","Data":"89c458c822f84297742fc7a0472ea4bd579442d0b249a2981f3a2764152acdd2"} Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.751719 4664 generic.go:334] "Generic (PLEG): container finished" podID="007d3aea-e570-499e-8373-3e43e65865c1" containerID="46fcd88dee1d3df9568ee3c1a5341aaf3ab6ccc47d4821f99f8d52cf576fa435" exitCode=0 Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.751765 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58f858bbd-kn4ws" event={"ID":"007d3aea-e570-499e-8373-3e43e65865c1","Type":"ContainerDied","Data":"46fcd88dee1d3df9568ee3c1a5341aaf3ab6ccc47d4821f99f8d52cf576fa435"} Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.751785 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58f858bbd-kn4ws" event={"ID":"007d3aea-e570-499e-8373-3e43e65865c1","Type":"ContainerDied","Data":"d18785270b9b3778c22cb4e4f38f22a1aee28663f0339d1cdfd9da0f5cfb3a29"} Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.751802 4664 scope.go:117] "RemoveContainer" containerID="46fcd88dee1d3df9568ee3c1a5341aaf3ab6ccc47d4821f99f8d52cf576fa435" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.751903 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58f858bbd-kn4ws" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.756217 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"81114761-59aa-4b5d-8848-963f6c73efe2","Type":"ContainerStarted","Data":"6b0a8d7188a31beffcf2e9e7b8936b55d0737e710e3d79c52c05a95ad7b9da9e"} Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.757199 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.780841 4664 scope.go:117] "RemoveContainer" containerID="5d221f7f5f88e0b33e037e3d454abee6ee4db6e00718bc2a5fa26031433469c6" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.794968 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.794942929 podStartE2EDuration="3.794942929s" podCreationTimestamp="2025-10-03 08:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:53.783207134 +0000 UTC m=+1294.604397634" watchObservedRunningTime="2025-10-03 08:09:53.794942929 +0000 UTC m=+1294.616133419" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.801450 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-combined-ca-bundle\") pod \"007d3aea-e570-499e-8373-3e43e65865c1\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.801555 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-config-data\") pod \"007d3aea-e570-499e-8373-3e43e65865c1\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " Oct 03 08:09:53 crc 
kubenswrapper[4664]: I1003 08:09:53.801657 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8bz4\" (UniqueName: \"kubernetes.io/projected/007d3aea-e570-499e-8373-3e43e65865c1-kube-api-access-q8bz4\") pod \"007d3aea-e570-499e-8373-3e43e65865c1\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.801718 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-config-data-custom\") pod \"007d3aea-e570-499e-8373-3e43e65865c1\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.801828 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007d3aea-e570-499e-8373-3e43e65865c1-logs\") pod \"007d3aea-e570-499e-8373-3e43e65865c1\" (UID: \"007d3aea-e570-499e-8373-3e43e65865c1\") " Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.805250 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/007d3aea-e570-499e-8373-3e43e65865c1-logs" (OuterVolumeSpecName: "logs") pod "007d3aea-e570-499e-8373-3e43e65865c1" (UID: "007d3aea-e570-499e-8373-3e43e65865c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.808148 4664 scope.go:117] "RemoveContainer" containerID="46fcd88dee1d3df9568ee3c1a5341aaf3ab6ccc47d4821f99f8d52cf576fa435" Oct 03 08:09:53 crc kubenswrapper[4664]: E1003 08:09:53.808683 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46fcd88dee1d3df9568ee3c1a5341aaf3ab6ccc47d4821f99f8d52cf576fa435\": container with ID starting with 46fcd88dee1d3df9568ee3c1a5341aaf3ab6ccc47d4821f99f8d52cf576fa435 not found: ID does not exist" containerID="46fcd88dee1d3df9568ee3c1a5341aaf3ab6ccc47d4821f99f8d52cf576fa435" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.808743 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46fcd88dee1d3df9568ee3c1a5341aaf3ab6ccc47d4821f99f8d52cf576fa435"} err="failed to get container status \"46fcd88dee1d3df9568ee3c1a5341aaf3ab6ccc47d4821f99f8d52cf576fa435\": rpc error: code = NotFound desc = could not find container \"46fcd88dee1d3df9568ee3c1a5341aaf3ab6ccc47d4821f99f8d52cf576fa435\": container with ID starting with 46fcd88dee1d3df9568ee3c1a5341aaf3ab6ccc47d4821f99f8d52cf576fa435 not found: ID does not exist" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.808773 4664 scope.go:117] "RemoveContainer" containerID="5d221f7f5f88e0b33e037e3d454abee6ee4db6e00718bc2a5fa26031433469c6" Oct 03 08:09:53 crc kubenswrapper[4664]: E1003 08:09:53.809270 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d221f7f5f88e0b33e037e3d454abee6ee4db6e00718bc2a5fa26031433469c6\": container with ID starting with 5d221f7f5f88e0b33e037e3d454abee6ee4db6e00718bc2a5fa26031433469c6 not found: ID does not exist" containerID="5d221f7f5f88e0b33e037e3d454abee6ee4db6e00718bc2a5fa26031433469c6" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.809293 4664 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5d221f7f5f88e0b33e037e3d454abee6ee4db6e00718bc2a5fa26031433469c6"} err="failed to get container status \"5d221f7f5f88e0b33e037e3d454abee6ee4db6e00718bc2a5fa26031433469c6\": rpc error: code = NotFound desc = could not find container \"5d221f7f5f88e0b33e037e3d454abee6ee4db6e00718bc2a5fa26031433469c6\": container with ID starting with 5d221f7f5f88e0b33e037e3d454abee6ee4db6e00718bc2a5fa26031433469c6 not found: ID does not exist" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.810554 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007d3aea-e570-499e-8373-3e43e65865c1-kube-api-access-q8bz4" (OuterVolumeSpecName: "kube-api-access-q8bz4") pod "007d3aea-e570-499e-8373-3e43e65865c1" (UID: "007d3aea-e570-499e-8373-3e43e65865c1"). InnerVolumeSpecName "kube-api-access-q8bz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.814684 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "007d3aea-e570-499e-8373-3e43e65865c1" (UID: "007d3aea-e570-499e-8373-3e43e65865c1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.838775 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "007d3aea-e570-499e-8373-3e43e65865c1" (UID: "007d3aea-e570-499e-8373-3e43e65865c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.877025 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-config-data" (OuterVolumeSpecName: "config-data") pod "007d3aea-e570-499e-8373-3e43e65865c1" (UID: "007d3aea-e570-499e-8373-3e43e65865c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.898925 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-b754456d9-lc2mg" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.905374 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.905421 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.905435 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8bz4\" (UniqueName: \"kubernetes.io/projected/007d3aea-e570-499e-8373-3e43e65865c1-kube-api-access-q8bz4\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.905449 4664 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/007d3aea-e570-499e-8373-3e43e65865c1-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.905461 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007d3aea-e570-499e-8373-3e43e65865c1-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.979703 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c47458c7b-zb4ls"] Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.980320 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c47458c7b-zb4ls" podUID="6e72a85b-4087-4478-9d83-91a468eda59d" containerName="neutron-api" containerID="cri-o://0d52ba589d591ea1719e9050e7a7a4c91cb57efc9056882128d6272f61ee7b0f" gracePeriod=30 Oct 03 08:09:53 crc kubenswrapper[4664]: I1003 08:09:53.981618 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c47458c7b-zb4ls" podUID="6e72a85b-4087-4478-9d83-91a468eda59d" containerName="neutron-httpd" containerID="cri-o://fa764f39bd76987fb25c5925222d5f6a1c5c9e9227bfd785e27565753d068f04" gracePeriod=30 Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.094888 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-58f858bbd-kn4ws"] Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.103107 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-58f858bbd-kn4ws"] Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.239961 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.243116 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.325500 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-rb6rm"] Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.325956 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" podUID="85f0c94a-31da-41e5-aa27-9840d3704a67" containerName="dnsmasq-dns" 
containerID="cri-o://2d76b47954e6fd7c49e12c3f0824653dc8bb66b39e76aecd27741298dc53ab1a" gracePeriod=10 Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.348685 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.770621 4664 generic.go:334] "Generic (PLEG): container finished" podID="6e72a85b-4087-4478-9d83-91a468eda59d" containerID="fa764f39bd76987fb25c5925222d5f6a1c5c9e9227bfd785e27565753d068f04" exitCode=0 Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.770718 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c47458c7b-zb4ls" event={"ID":"6e72a85b-4087-4478-9d83-91a468eda59d","Type":"ContainerDied","Data":"fa764f39bd76987fb25c5925222d5f6a1c5c9e9227bfd785e27565753d068f04"} Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.777206 4664 generic.go:334] "Generic (PLEG): container finished" podID="85f0c94a-31da-41e5-aa27-9840d3704a67" containerID="2d76b47954e6fd7c49e12c3f0824653dc8bb66b39e76aecd27741298dc53ab1a" exitCode=0 Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.777822 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" event={"ID":"85f0c94a-31da-41e5-aa27-9840d3704a67","Type":"ContainerDied","Data":"2d76b47954e6fd7c49e12c3f0824653dc8bb66b39e76aecd27741298dc53ab1a"} Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.778514 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" containerName="probe" containerID="cri-o://1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064" gracePeriod=30 Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.778216 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" containerName="cinder-scheduler" containerID="cri-o://da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45" gracePeriod=30 Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.814659 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.928020 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-dns-svc\") pod \"85f0c94a-31da-41e5-aa27-9840d3704a67\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.928122 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-config\") pod \"85f0c94a-31da-41e5-aa27-9840d3704a67\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.928248 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-ovsdbserver-sb\") pod \"85f0c94a-31da-41e5-aa27-9840d3704a67\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.928275 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-ovsdbserver-nb\") pod \"85f0c94a-31da-41e5-aa27-9840d3704a67\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.928337 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kkmd\" (UniqueName: \"kubernetes.io/projected/85f0c94a-31da-41e5-aa27-9840d3704a67-kube-api-access-8kkmd\") pod \"85f0c94a-31da-41e5-aa27-9840d3704a67\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.928379 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-dns-swift-storage-0\") pod \"85f0c94a-31da-41e5-aa27-9840d3704a67\" (UID: \"85f0c94a-31da-41e5-aa27-9840d3704a67\") " Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.944323 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f0c94a-31da-41e5-aa27-9840d3704a67-kube-api-access-8kkmd" (OuterVolumeSpecName: "kube-api-access-8kkmd") pod "85f0c94a-31da-41e5-aa27-9840d3704a67" (UID: "85f0c94a-31da-41e5-aa27-9840d3704a67"). InnerVolumeSpecName "kube-api-access-8kkmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:54 crc kubenswrapper[4664]: I1003 08:09:54.986067 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "85f0c94a-31da-41e5-aa27-9840d3704a67" (UID: "85f0c94a-31da-41e5-aa27-9840d3704a67"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.068007 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kkmd\" (UniqueName: \"kubernetes.io/projected/85f0c94a-31da-41e5-aa27-9840d3704a67-kube-api-access-8kkmd\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.068054 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.080123 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "85f0c94a-31da-41e5-aa27-9840d3704a67" (UID: "85f0c94a-31da-41e5-aa27-9840d3704a67"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.080760 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "85f0c94a-31da-41e5-aa27-9840d3704a67" (UID: "85f0c94a-31da-41e5-aa27-9840d3704a67"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.094192 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "85f0c94a-31da-41e5-aa27-9840d3704a67" (UID: "85f0c94a-31da-41e5-aa27-9840d3704a67"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.099777 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-config" (OuterVolumeSpecName: "config") pod "85f0c94a-31da-41e5-aa27-9840d3704a67" (UID: "85f0c94a-31da-41e5-aa27-9840d3704a67"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.169918 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.169962 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.169976 4664 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.169989 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85f0c94a-31da-41e5-aa27-9840d3704a67-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.802777 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" event={"ID":"85f0c94a-31da-41e5-aa27-9840d3704a67","Type":"ContainerDied","Data":"3fd0cd375b74b811bac2a3598c913a8298257e4c911faf9270f338e8c6b03064"} Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.802818 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-rb6rm" Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.802859 4664 scope.go:117] "RemoveContainer" containerID="2d76b47954e6fd7c49e12c3f0824653dc8bb66b39e76aecd27741298dc53ab1a" Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.882159 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-rb6rm"] Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.891894 4664 scope.go:117] "RemoveContainer" containerID="829e099c46902780ec295143722995fd114225c55c34277c05bb89f90cfb29b2" Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.903289 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="007d3aea-e570-499e-8373-3e43e65865c1" path="/var/lib/kubelet/pods/007d3aea-e570-499e-8373-3e43e65865c1/volumes" Oct 03 08:09:55 crc kubenswrapper[4664]: I1003 08:09:55.903974 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-rb6rm"] Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.520987 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.705994 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-config-data-custom\") pod \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.706069 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-scripts\") pod \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.706097 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp6mk\" (UniqueName: \"kubernetes.io/projected/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-kube-api-access-lp6mk\") pod \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.706238 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-config-data\") pod \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.706330 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-combined-ca-bundle\") pod \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.706370 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-etc-machine-id\") pod \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\" (UID: \"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f\") " Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.706805 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" (UID: "acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.715490 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" (UID: "acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.715702 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-kube-api-access-lp6mk" (OuterVolumeSpecName: "kube-api-access-lp6mk") pod "acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" (UID: "acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f"). InnerVolumeSpecName "kube-api-access-lp6mk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.720725 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-scripts" (OuterVolumeSpecName: "scripts") pod "acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" (UID: "acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.736549 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-bcb7d647f-zjhrm"] Oct 03 08:09:56 crc kubenswrapper[4664]: E1003 08:09:56.737120 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f0c94a-31da-41e5-aa27-9840d3704a67" containerName="dnsmasq-dns" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.737144 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f0c94a-31da-41e5-aa27-9840d3704a67" containerName="dnsmasq-dns" Oct 03 08:09:56 crc kubenswrapper[4664]: E1003 08:09:56.737167 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" containerName="cinder-scheduler" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.737175 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" containerName="cinder-scheduler" Oct 03 08:09:56 crc kubenswrapper[4664]: E1003 08:09:56.737190 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007d3aea-e570-499e-8373-3e43e65865c1" containerName="barbican-api-log" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.737199 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="007d3aea-e570-499e-8373-3e43e65865c1" containerName="barbican-api-log" Oct 03 08:09:56 crc kubenswrapper[4664]: E1003 08:09:56.737229 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007d3aea-e570-499e-8373-3e43e65865c1" containerName="barbican-api" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.737236 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="007d3aea-e570-499e-8373-3e43e65865c1" containerName="barbican-api" Oct 03 08:09:56 crc kubenswrapper[4664]: E1003 08:09:56.737255 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" containerName="probe" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.737263 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" containerName="probe" Oct 03 08:09:56 crc kubenswrapper[4664]: E1003 08:09:56.737277 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f0c94a-31da-41e5-aa27-9840d3704a67" containerName="init" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.737283 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f0c94a-31da-41e5-aa27-9840d3704a67" containerName="init" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.737554 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" containerName="probe" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.737595 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="007d3aea-e570-499e-8373-3e43e65865c1" containerName="barbican-api" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.737633 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="007d3aea-e570-499e-8373-3e43e65865c1" 
containerName="barbican-api-log" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.737648 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" containerName="cinder-scheduler" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.737663 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f0c94a-31da-41e5-aa27-9840d3704a67" containerName="dnsmasq-dns" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.738904 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.743074 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.743368 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.747930 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.754302 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-bcb7d647f-zjhrm"] Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.803909 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" (UID: "acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.809154 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.809195 4664 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.809208 4664 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.809220 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.809247 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp6mk\" (UniqueName: \"kubernetes.io/projected/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-kube-api-access-lp6mk\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.836205 4664 generic.go:334] "Generic (PLEG): container finished" podID="acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" containerID="1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064" exitCode=0 Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.836242 4664 generic.go:334] "Generic (PLEG): container finished" podID="acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" 
containerID="da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45" exitCode=0 Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.836285 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.836311 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f","Type":"ContainerDied","Data":"1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064"} Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.836381 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f","Type":"ContainerDied","Data":"da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45"} Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.836399 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f","Type":"ContainerDied","Data":"825f2bb8ab089a84d717509952f98b1a61cb881f2429204cf89ed329323c6eff"} Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.836419 4664 scope.go:117] "RemoveContainer" containerID="1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.912848 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-config-data" (OuterVolumeSpecName: "config-data") pod "acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" (UID: "acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.912887 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rpbq\" (UniqueName: \"kubernetes.io/projected/f47ce762-22b9-4066-87d2-39e16a5b6c6d-kube-api-access-8rpbq\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.913243 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f47ce762-22b9-4066-87d2-39e16a5b6c6d-combined-ca-bundle\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.913304 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f47ce762-22b9-4066-87d2-39e16a5b6c6d-log-httpd\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.913432 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f47ce762-22b9-4066-87d2-39e16a5b6c6d-run-httpd\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.913470 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f47ce762-22b9-4066-87d2-39e16a5b6c6d-public-tls-certs\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.913555 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f47ce762-22b9-4066-87d2-39e16a5b6c6d-config-data\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.913580 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f47ce762-22b9-4066-87d2-39e16a5b6c6d-internal-tls-certs\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.913642 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f47ce762-22b9-4066-87d2-39e16a5b6c6d-etc-swift\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.913879 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:56 crc kubenswrapper[4664]: I1003 08:09:56.987147 4664 scope.go:117] "RemoveContainer" containerID="da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.015738 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f47ce762-22b9-4066-87d2-39e16a5b6c6d-combined-ca-bundle\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.015782 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f47ce762-22b9-4066-87d2-39e16a5b6c6d-log-httpd\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.015842 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f47ce762-22b9-4066-87d2-39e16a5b6c6d-run-httpd\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.015899 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f47ce762-22b9-4066-87d2-39e16a5b6c6d-public-tls-certs\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.015944 4664 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f47ce762-22b9-4066-87d2-39e16a5b6c6d-config-data\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.015963 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f47ce762-22b9-4066-87d2-39e16a5b6c6d-internal-tls-certs\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.015997 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f47ce762-22b9-4066-87d2-39e16a5b6c6d-etc-swift\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.016126 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rpbq\" (UniqueName: \"kubernetes.io/projected/f47ce762-22b9-4066-87d2-39e16a5b6c6d-kube-api-access-8rpbq\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.017253 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f47ce762-22b9-4066-87d2-39e16a5b6c6d-run-httpd\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.018254 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f47ce762-22b9-4066-87d2-39e16a5b6c6d-log-httpd\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.024544 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f47ce762-22b9-4066-87d2-39e16a5b6c6d-config-data\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.029266 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f47ce762-22b9-4066-87d2-39e16a5b6c6d-combined-ca-bundle\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.039925 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f47ce762-22b9-4066-87d2-39e16a5b6c6d-public-tls-certs\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.040153 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/f47ce762-22b9-4066-87d2-39e16a5b6c6d-etc-swift\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.040512 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f47ce762-22b9-4066-87d2-39e16a5b6c6d-internal-tls-certs\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: \"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.061859 4664 scope.go:117] "RemoveContainer" containerID="1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064" Oct 03 08:09:57 crc kubenswrapper[4664]: E1003 08:09:57.063702 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064\": container with ID starting with 1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064 not found: ID does not exist" containerID="1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.063750 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064"} err="failed to get container status \"1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064\": rpc error: code = NotFound desc = could not find container \"1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064\": container with ID starting with 1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064 not found: ID does not exist" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.063777 4664 scope.go:117] "RemoveContainer" containerID="da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45" Oct 03 08:09:57 crc kubenswrapper[4664]: E1003 08:09:57.065229 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45\": container with ID starting with da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45 not found: ID does not exist" containerID="da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.065257 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45"} err="failed to get container status \"da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45\": rpc error: code = NotFound desc = could not find container \"da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45\": container with ID starting with da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45 not found: ID does not exist" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.065280 4664 scope.go:117] "RemoveContainer" containerID="1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.067233 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rpbq\" (UniqueName: \"kubernetes.io/projected/f47ce762-22b9-4066-87d2-39e16a5b6c6d-kube-api-access-8rpbq\") pod \"swift-proxy-bcb7d647f-zjhrm\" (UID: 
\"f47ce762-22b9-4066-87d2-39e16a5b6c6d\") " pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.069838 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064"} err="failed to get container status \"1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064\": rpc error: code = NotFound desc = could not find container \"1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064\": container with ID starting with 1af1bb3b970c8ef1052412b8f46eff8a351e6b08149a0b11fe5770b5dd2e5064 not found: ID does not exist" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.069912 4664 scope.go:117] "RemoveContainer" containerID="da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.076647 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45"} err="failed to get container status \"da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45\": rpc error: code = NotFound desc = could not find container \"da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45\": container with ID starting with da0df2e299f01a9a0033b6926895ab064100c6876f2f24cd287547a42e5aed45 not found: ID does not exist" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.082972 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.188528 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.207207 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.225436 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.227473 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.231665 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.245924 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.426447 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0bb12c08-bee2-4964-892c-98b4d9a75b82-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.426985 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb12c08-bee2-4964-892c-98b4d9a75b82-config-data\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.427031 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsnsm\" (UniqueName: \"kubernetes.io/projected/0bb12c08-bee2-4964-892c-98b4d9a75b82-kube-api-access-fsnsm\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.427054 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0bb12c08-bee2-4964-892c-98b4d9a75b82-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.427115 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb12c08-bee2-4964-892c-98b4d9a75b82-scripts\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.427148 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb12c08-bee2-4964-892c-98b4d9a75b82-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.528812 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0bb12c08-bee2-4964-892c-98b4d9a75b82-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.528910 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb12c08-bee2-4964-892c-98b4d9a75b82-config-data\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.528934 4664 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0bb12c08-bee2-4964-892c-98b4d9a75b82-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.528943 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsnsm\" (UniqueName: \"kubernetes.io/projected/0bb12c08-bee2-4964-892c-98b4d9a75b82-kube-api-access-fsnsm\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.529098 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0bb12c08-bee2-4964-892c-98b4d9a75b82-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.529288 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb12c08-bee2-4964-892c-98b4d9a75b82-scripts\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.529353 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb12c08-bee2-4964-892c-98b4d9a75b82-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.536820 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0bb12c08-bee2-4964-892c-98b4d9a75b82-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.537199 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb12c08-bee2-4964-892c-98b4d9a75b82-config-data\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.537477 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb12c08-bee2-4964-892c-98b4d9a75b82-scripts\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.537582 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb12c08-bee2-4964-892c-98b4d9a75b82-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.550944 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsnsm\" (UniqueName: \"kubernetes.io/projected/0bb12c08-bee2-4964-892c-98b4d9a75b82-kube-api-access-fsnsm\") pod \"cinder-scheduler-0\" (UID: \"0bb12c08-bee2-4964-892c-98b4d9a75b82\") " pod="openstack/cinder-scheduler-0" Oct 
03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.820434 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-bcb7d647f-zjhrm"] Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.852109 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.935962 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85f0c94a-31da-41e5-aa27-9840d3704a67" path="/var/lib/kubelet/pods/85f0c94a-31da-41e5-aa27-9840d3704a67/volumes" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.936544 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f" path="/var/lib/kubelet/pods/acc65a0d-b0c7-44d0-888d-9b9e46bb7d9f/volumes" Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.978804 4664 generic.go:334] "Generic (PLEG): container finished" podID="6e72a85b-4087-4478-9d83-91a468eda59d" containerID="0d52ba589d591ea1719e9050e7a7a4c91cb57efc9056882128d6272f61ee7b0f" exitCode=0 Oct 03 08:09:57 crc kubenswrapper[4664]: I1003 08:09:57.978883 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c47458c7b-zb4ls" event={"ID":"6e72a85b-4087-4478-9d83-91a468eda59d","Type":"ContainerDied","Data":"0d52ba589d591ea1719e9050e7a7a4c91cb57efc9056882128d6272f61ee7b0f"} Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.144725 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.185410 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-ovndb-tls-certs\") pod \"6e72a85b-4087-4478-9d83-91a468eda59d\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.185968 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-httpd-config\") pod \"6e72a85b-4087-4478-9d83-91a468eda59d\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.188334 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-combined-ca-bundle\") pod \"6e72a85b-4087-4478-9d83-91a468eda59d\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.188426 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7td7\" (UniqueName: \"kubernetes.io/projected/6e72a85b-4087-4478-9d83-91a468eda59d-kube-api-access-v7td7\") pod \"6e72a85b-4087-4478-9d83-91a468eda59d\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.188495 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-config\") pod \"6e72a85b-4087-4478-9d83-91a468eda59d\" (UID: \"6e72a85b-4087-4478-9d83-91a468eda59d\") " Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.201476 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6e72a85b-4087-4478-9d83-91a468eda59d-kube-api-access-v7td7" (OuterVolumeSpecName: "kube-api-access-v7td7") pod "6e72a85b-4087-4478-9d83-91a468eda59d" (UID: "6e72a85b-4087-4478-9d83-91a468eda59d"). InnerVolumeSpecName "kube-api-access-v7td7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.203814 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6e72a85b-4087-4478-9d83-91a468eda59d" (UID: "6e72a85b-4087-4478-9d83-91a468eda59d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.293368 4664 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.293413 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7td7\" (UniqueName: \"kubernetes.io/projected/6e72a85b-4087-4478-9d83-91a468eda59d-kube-api-access-v7td7\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.312285 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-config" (OuterVolumeSpecName: "config") pod "6e72a85b-4087-4478-9d83-91a468eda59d" (UID: "6e72a85b-4087-4478-9d83-91a468eda59d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.337909 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e72a85b-4087-4478-9d83-91a468eda59d" (UID: "6e72a85b-4087-4478-9d83-91a468eda59d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.346562 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6e72a85b-4087-4478-9d83-91a468eda59d" (UID: "6e72a85b-4087-4478-9d83-91a468eda59d"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.394948 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.395319 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.395334 4664 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e72a85b-4087-4478-9d83-91a468eda59d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:09:58 crc kubenswrapper[4664]: W1003 08:09:58.578253 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bb12c08_bee2_4964_892c_98b4d9a75b82.slice/crio-88cfb9e06ceff3510e49aec5331ce2745b786fd9dd171d02bbcbf8fb098f59e6 WatchSource:0}: Error finding container 88cfb9e06ceff3510e49aec5331ce2745b786fd9dd171d02bbcbf8fb098f59e6: Status 404 returned error can't find the container with id 88cfb9e06ceff3510e49aec5331ce2745b786fd9dd171d02bbcbf8fb098f59e6 Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.586645 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.826511 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.826797 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="ceilometer-central-agent" containerID="cri-o://9ae2ef0d9c43e7e6df456cbd87079e52b75a8230fbc13b17fa002c29763e687b" gracePeriod=30 Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.827517 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="proxy-httpd" containerID="cri-o://19b56281c965636b4ff097ca0f6d749632f22c733e4c76eaf2d6d5bda8d87c50" gracePeriod=30 Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.827584 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="sg-core" containerID="cri-o://90abca23b94a01f13efa40f02002f7a387d97640b51b60640219ec78d31b2367" gracePeriod=30 Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.827652 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="ceilometer-notification-agent" containerID="cri-o://288a4092fd889c4bfc68e6659997e8fe06bbf3f67cd75e26f921c8e8b07f6632" gracePeriod=30 Oct 03 08:09:58 crc kubenswrapper[4664]: I1003 08:09:58.839210 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.033036 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bcb7d647f-zjhrm" 
event={"ID":"f47ce762-22b9-4066-87d2-39e16a5b6c6d","Type":"ContainerStarted","Data":"67f7bf5d77c006feff1fa301434a817fe4b7ffe69cb6878657fd374ed4705648"} Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.033673 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.033726 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bcb7d647f-zjhrm" event={"ID":"f47ce762-22b9-4066-87d2-39e16a5b6c6d","Type":"ContainerStarted","Data":"5700417c74fcd9a4d78500598935512151c61f50446adfc91a819e534eff7fa3"} Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.033744 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bcb7d647f-zjhrm" event={"ID":"f47ce762-22b9-4066-87d2-39e16a5b6c6d","Type":"ContainerStarted","Data":"27b4b998741c8dfe13100c54a44ea1d6dd4cffdcbafd544bcef80627c8d78fa8"} Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.033760 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.038981 4664 generic.go:334] "Generic (PLEG): container finished" podID="e259c239-66b0-409a-8819-91430929950a" containerID="90abca23b94a01f13efa40f02002f7a387d97640b51b60640219ec78d31b2367" exitCode=2 Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.039078 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e259c239-66b0-409a-8819-91430929950a","Type":"ContainerDied","Data":"90abca23b94a01f13efa40f02002f7a387d97640b51b60640219ec78d31b2367"} Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.041695 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0bb12c08-bee2-4964-892c-98b4d9a75b82","Type":"ContainerStarted","Data":"88cfb9e06ceff3510e49aec5331ce2745b786fd9dd171d02bbcbf8fb098f59e6"} Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.044674 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c47458c7b-zb4ls" event={"ID":"6e72a85b-4087-4478-9d83-91a468eda59d","Type":"ContainerDied","Data":"a9bcea5762989180bbf1b5cd957868de170959703d47ae52a11356c5e382a69e"} Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.044734 4664 scope.go:117] "RemoveContainer" containerID="fa764f39bd76987fb25c5925222d5f6a1c5c9e9227bfd785e27565753d068f04" Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.044909 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c47458c7b-zb4ls" Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.068286 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-bcb7d647f-zjhrm" podStartSLOduration=3.068259906 podStartE2EDuration="3.068259906s" podCreationTimestamp="2025-10-03 08:09:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:09:59.054150206 +0000 UTC m=+1299.875340696" watchObservedRunningTime="2025-10-03 08:09:59.068259906 +0000 UTC m=+1299.889450396" Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.114307 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c47458c7b-zb4ls"] Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.116202 4664 scope.go:117] "RemoveContainer" containerID="0d52ba589d591ea1719e9050e7a7a4c91cb57efc9056882128d6272f61ee7b0f" Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.124351 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6c47458c7b-zb4ls"] Oct 03 08:09:59 crc kubenswrapper[4664]: I1003 08:09:59.891249 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e72a85b-4087-4478-9d83-91a468eda59d" path="/var/lib/kubelet/pods/6e72a85b-4087-4478-9d83-91a468eda59d/volumes" Oct 03 08:10:00 crc kubenswrapper[4664]: I1003 08:10:00.059643 4664 generic.go:334] "Generic (PLEG): container finished" podID="e259c239-66b0-409a-8819-91430929950a" containerID="19b56281c965636b4ff097ca0f6d749632f22c733e4c76eaf2d6d5bda8d87c50" exitCode=0 Oct 03 08:10:00 crc kubenswrapper[4664]: I1003 08:10:00.059676 4664 generic.go:334] "Generic (PLEG): container finished" podID="e259c239-66b0-409a-8819-91430929950a" containerID="9ae2ef0d9c43e7e6df456cbd87079e52b75a8230fbc13b17fa002c29763e687b" exitCode=0 Oct 03 08:10:00 crc kubenswrapper[4664]: I1003 08:10:00.059743 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e259c239-66b0-409a-8819-91430929950a","Type":"ContainerDied","Data":"19b56281c965636b4ff097ca0f6d749632f22c733e4c76eaf2d6d5bda8d87c50"} Oct 03 08:10:00 crc kubenswrapper[4664]: I1003 08:10:00.059795 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e259c239-66b0-409a-8819-91430929950a","Type":"ContainerDied","Data":"9ae2ef0d9c43e7e6df456cbd87079e52b75a8230fbc13b17fa002c29763e687b"} Oct 03 08:10:00 crc kubenswrapper[4664]: I1003 08:10:00.061936 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0bb12c08-bee2-4964-892c-98b4d9a75b82","Type":"ContainerStarted","Data":"bdb1a9e6c10d070caa8dee2ff92a02ddcf5e166a05afde1ba382d2fa06a9853d"} Oct 03 08:10:00 crc kubenswrapper[4664]: I1003 08:10:00.061977 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0bb12c08-bee2-4964-892c-98b4d9a75b82","Type":"ContainerStarted","Data":"7ba99da9a63472a39c5d11456cfdc9044d31e64281cb307e3f6330c327df357e"} Oct 03 08:10:00 crc kubenswrapper[4664]: I1003 08:10:00.086011 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.085993649 podStartE2EDuration="3.085993649s" podCreationTimestamp="2025-10-03 08:09:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:10:00.08122647 +0000 UTC 
m=+1300.902416960" watchObservedRunningTime="2025-10-03 08:10:00.085993649 +0000 UTC m=+1300.907184139" Oct 03 08:10:01 crc kubenswrapper[4664]: I1003 08:10:01.081130 4664 generic.go:334] "Generic (PLEG): container finished" podID="e259c239-66b0-409a-8819-91430929950a" containerID="288a4092fd889c4bfc68e6659997e8fe06bbf3f67cd75e26f921c8e8b07f6632" exitCode=0 Oct 03 08:10:01 crc kubenswrapper[4664]: I1003 08:10:01.081228 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e259c239-66b0-409a-8819-91430929950a","Type":"ContainerDied","Data":"288a4092fd889c4bfc68e6659997e8fe06bbf3f67cd75e26f921c8e8b07f6632"} Oct 03 08:10:02 crc kubenswrapper[4664]: I1003 08:10:02.852818 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.468770 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.582910 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-76644f9584-br5jb" podUID="30ce3373-ef30-4727-b57f-5be7963d1892" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.626538 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-zp2sz"] Oct 03 08:10:03 crc kubenswrapper[4664]: E1003 08:10:03.627058 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e72a85b-4087-4478-9d83-91a468eda59d" containerName="neutron-api" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.627081 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e72a85b-4087-4478-9d83-91a468eda59d" containerName="neutron-api" Oct 03 08:10:03 crc kubenswrapper[4664]: E1003 08:10:03.627100 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e72a85b-4087-4478-9d83-91a468eda59d" containerName="neutron-httpd" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.627108 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e72a85b-4087-4478-9d83-91a468eda59d" containerName="neutron-httpd" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.627339 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e72a85b-4087-4478-9d83-91a468eda59d" containerName="neutron-api" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.627376 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e72a85b-4087-4478-9d83-91a468eda59d" containerName="neutron-httpd" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.628768 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zp2sz" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.656712 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zp2sz"] Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.728807 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-4q9vk"] Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.730170 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-4q9vk" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.740043 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4v7q\" (UniqueName: \"kubernetes.io/projected/1af25c5f-f748-4b42-8c46-f7929cb72bdf-kube-api-access-p4v7q\") pod \"nova-api-db-create-zp2sz\" (UID: \"1af25c5f-f748-4b42-8c46-f7929cb72bdf\") " pod="openstack/nova-api-db-create-zp2sz" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.742295 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4q9vk"] Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.811264 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-n7nck"] Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.813057 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-n7nck" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.840667 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-n7nck"] Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.842229 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4v7q\" (UniqueName: \"kubernetes.io/projected/1af25c5f-f748-4b42-8c46-f7929cb72bdf-kube-api-access-p4v7q\") pod \"nova-api-db-create-zp2sz\" (UID: \"1af25c5f-f748-4b42-8c46-f7929cb72bdf\") " pod="openstack/nova-api-db-create-zp2sz" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.842481 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm9nn\" (UniqueName: \"kubernetes.io/projected/baf2ab95-9e7c-4e4f-b653-c4dbb00b0745-kube-api-access-xm9nn\") pod \"nova-cell0-db-create-4q9vk\" (UID: \"baf2ab95-9e7c-4e4f-b653-c4dbb00b0745\") " pod="openstack/nova-cell0-db-create-4q9vk" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.865191 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4v7q\" (UniqueName: \"kubernetes.io/projected/1af25c5f-f748-4b42-8c46-f7929cb72bdf-kube-api-access-p4v7q\") pod \"nova-api-db-create-zp2sz\" (UID: \"1af25c5f-f748-4b42-8c46-f7929cb72bdf\") " pod="openstack/nova-api-db-create-zp2sz" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.944385 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62slg\" (UniqueName: \"kubernetes.io/projected/76e6be14-99be-4c39-8b45-43cf4dd937c5-kube-api-access-62slg\") pod \"nova-cell1-db-create-n7nck\" (UID: \"76e6be14-99be-4c39-8b45-43cf4dd937c5\") " pod="openstack/nova-cell1-db-create-n7nck" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.944466 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm9nn\" (UniqueName: \"kubernetes.io/projected/baf2ab95-9e7c-4e4f-b653-c4dbb00b0745-kube-api-access-xm9nn\") pod \"nova-cell0-db-create-4q9vk\" (UID: \"baf2ab95-9e7c-4e4f-b653-c4dbb00b0745\") " pod="openstack/nova-cell0-db-create-4q9vk" Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.962338 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm9nn\" (UniqueName: \"kubernetes.io/projected/baf2ab95-9e7c-4e4f-b653-c4dbb00b0745-kube-api-access-xm9nn\") pod \"nova-cell0-db-create-4q9vk\" (UID: \"baf2ab95-9e7c-4e4f-b653-c4dbb00b0745\") " pod="openstack/nova-cell0-db-create-4q9vk" 
Oct 03 08:10:03 crc kubenswrapper[4664]: I1003 08:10:03.980833 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zp2sz"
Oct 03 08:10:04 crc kubenswrapper[4664]: I1003 08:10:04.046312 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62slg\" (UniqueName: \"kubernetes.io/projected/76e6be14-99be-4c39-8b45-43cf4dd937c5-kube-api-access-62slg\") pod \"nova-cell1-db-create-n7nck\" (UID: \"76e6be14-99be-4c39-8b45-43cf4dd937c5\") " pod="openstack/nova-cell1-db-create-n7nck"
Oct 03 08:10:04 crc kubenswrapper[4664]: I1003 08:10:04.068532 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4q9vk"
Oct 03 08:10:04 crc kubenswrapper[4664]: I1003 08:10:04.088805 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62slg\" (UniqueName: \"kubernetes.io/projected/76e6be14-99be-4c39-8b45-43cf4dd937c5-kube-api-access-62slg\") pod \"nova-cell1-db-create-n7nck\" (UID: \"76e6be14-99be-4c39-8b45-43cf4dd937c5\") " pod="openstack/nova-cell1-db-create-n7nck"
Oct 03 08:10:04 crc kubenswrapper[4664]: I1003 08:10:04.150376 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-n7nck"
Oct 03 08:10:05 crc kubenswrapper[4664]: I1003 08:10:05.955704 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 08:10:05 crc kubenswrapper[4664]: I1003 08:10:05.956365 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6837bc3c-3e78-491c-8f0c-377955560009" containerName="glance-httpd" containerID="cri-o://eb1303c4f6e4d5ecb19909b618518eb2db4beabcd25acd9db5c40270d97f1986" gracePeriod=30
Oct 03 08:10:05 crc kubenswrapper[4664]: I1003 08:10:05.956554 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6837bc3c-3e78-491c-8f0c-377955560009" containerName="glance-log" containerID="cri-o://2de90cfa4619e4a13019591892950d05a8b7cfea2d32a02ef57d42d8c4658c76" gracePeriod=30
Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.142524 4664 generic.go:334] "Generic (PLEG): container finished" podID="6837bc3c-3e78-491c-8f0c-377955560009" containerID="2de90cfa4619e4a13019591892950d05a8b7cfea2d32a02ef57d42d8c4658c76" exitCode=143
Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.142579 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6837bc3c-3e78-491c-8f0c-377955560009","Type":"ContainerDied","Data":"2de90cfa4619e4a13019591892950d05a8b7cfea2d32a02ef57d42d8c4658c76"}
Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.746368 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
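The exit codes in the ContainerDied trail encode how each container went down: a process that handles SIGTERM exits with its own status (0 for the ceilometer agents, 2 for sg-core), while one killed by a signal reports 128+signum, so glance-log's 143 is SIGTERM(15) landing inside the grace period, and horizon's later 137 is the SIGKILL(9) that arrives when the grace period runs out. The convention in a few lines of Go:

```go
// exitcode.go - why this log shows exitCode=0, 2, 137 and 143.
package main

import "fmt"

// exitCode follows the POSIX shell convention: normal exits keep
// their status, signal deaths report 128 + signal number.
func exitCode(exited bool, status, signal int) int {
	if exited {
		return status
	}
	return 128 + signal
}

func main() {
	fmt.Println(exitCode(true, 0, 0))   // graceful stop               -> 0
	fmt.Println(exitCode(true, 2, 0))   // app's own error status      -> 2
	fmt.Println(exitCode(false, 0, 15)) // SIGTERM, unhandled          -> 143
	fmt.Println(exitCode(false, 0, 9))  // SIGKILL after grace period  -> 137
}
```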
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.809017 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-config-data\") pod \"e259c239-66b0-409a-8819-91430929950a\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.809431 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-scripts\") pod \"e259c239-66b0-409a-8819-91430929950a\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.809646 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-sg-core-conf-yaml\") pod \"e259c239-66b0-409a-8819-91430929950a\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.809695 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-combined-ca-bundle\") pod \"e259c239-66b0-409a-8819-91430929950a\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.809781 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e259c239-66b0-409a-8819-91430929950a-log-httpd\") pod \"e259c239-66b0-409a-8819-91430929950a\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.809821 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvcwk\" (UniqueName: \"kubernetes.io/projected/e259c239-66b0-409a-8819-91430929950a-kube-api-access-cvcwk\") pod \"e259c239-66b0-409a-8819-91430929950a\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.809881 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e259c239-66b0-409a-8819-91430929950a-run-httpd\") pod \"e259c239-66b0-409a-8819-91430929950a\" (UID: \"e259c239-66b0-409a-8819-91430929950a\") " Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.811307 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e259c239-66b0-409a-8819-91430929950a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e259c239-66b0-409a-8819-91430929950a" (UID: "e259c239-66b0-409a-8819-91430929950a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.811566 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e259c239-66b0-409a-8819-91430929950a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e259c239-66b0-409a-8819-91430929950a" (UID: "e259c239-66b0-409a-8819-91430929950a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.817263 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e259c239-66b0-409a-8819-91430929950a-kube-api-access-cvcwk" (OuterVolumeSpecName: "kube-api-access-cvcwk") pod "e259c239-66b0-409a-8819-91430929950a" (UID: "e259c239-66b0-409a-8819-91430929950a"). InnerVolumeSpecName "kube-api-access-cvcwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.817985 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-scripts" (OuterVolumeSpecName: "scripts") pod "e259c239-66b0-409a-8819-91430929950a" (UID: "e259c239-66b0-409a-8819-91430929950a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.840767 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e259c239-66b0-409a-8819-91430929950a" (UID: "e259c239-66b0-409a-8819-91430929950a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.899433 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e259c239-66b0-409a-8819-91430929950a" (UID: "e259c239-66b0-409a-8819-91430929950a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.912253 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.912283 4664 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.912295 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.912306 4664 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e259c239-66b0-409a-8819-91430929950a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.912317 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvcwk\" (UniqueName: \"kubernetes.io/projected/e259c239-66b0-409a-8819-91430929950a-kube-api-access-cvcwk\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.912325 4664 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e259c239-66b0-409a-8819-91430929950a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:06 crc kubenswrapper[4664]: I1003 08:10:06.933144 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-config-data" (OuterVolumeSpecName: "config-data") pod "e259c239-66b0-409a-8819-91430929950a" (UID: "e259c239-66b0-409a-8819-91430929950a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.014303 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e259c239-66b0-409a-8819-91430929950a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:07 crc kubenswrapper[4664]: W1003 08:10:07.043487 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbaf2ab95_9e7c_4e4f_b653_c4dbb00b0745.slice/crio-73f0b8f74947e35e4c4af6500cff9f8b8f05000063c226f9dd70a003885ed614 WatchSource:0}: Error finding container 73f0b8f74947e35e4c4af6500cff9f8b8f05000063c226f9dd70a003885ed614: Status 404 returned error can't find the container with id 73f0b8f74947e35e4c4af6500cff9f8b8f05000063c226f9dd70a003885ed614 Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.044065 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4q9vk"] Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.094800 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.094880 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-bcb7d647f-zjhrm" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.156922 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"06ee9d13-7b0c-4619-8421-6f1a5d8a2f05","Type":"ContainerStarted","Data":"76afa978484182f06e866924548c19ed129f5eb01d9689e314182aaba924df0c"} Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.160686 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zp2sz"] Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.163919 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4q9vk" event={"ID":"baf2ab95-9e7c-4e4f-b653-c4dbb00b0745","Type":"ContainerStarted","Data":"73f0b8f74947e35e4c4af6500cff9f8b8f05000063c226f9dd70a003885ed614"} Oct 03 08:10:07 crc kubenswrapper[4664]: W1003 08:10:07.165675 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1af25c5f_f748_4b42_8c46_f7929cb72bdf.slice/crio-d90708c6ecf7cbfac991b2bbb4bee73e5e2e703300302773145b52efc41b84b0 WatchSource:0}: Error finding container d90708c6ecf7cbfac991b2bbb4bee73e5e2e703300302773145b52efc41b84b0: Status 404 returned error can't find the container with id d90708c6ecf7cbfac991b2bbb4bee73e5e2e703300302773145b52efc41b84b0 Oct 03 08:10:07 crc kubenswrapper[4664]: W1003 08:10:07.171003 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76e6be14_99be_4c39_8b45_43cf4dd937c5.slice/crio-f57fa68dfe68d122266301e2dc35efc6359b00577c901a8ba0096cb5f721eda4 WatchSource:0}: Error finding container f57fa68dfe68d122266301e2dc35efc6359b00577c901a8ba0096cb5f721eda4: Status 404 returned error can't find the container with id f57fa68dfe68d122266301e2dc35efc6359b00577c901a8ba0096cb5f721eda4 Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.172585 4664 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e259c239-66b0-409a-8819-91430929950a","Type":"ContainerDied","Data":"b007fc15da1f7652c2e2e7bb3d38853648fd2c936f8817ef0530ca1fa03a75ac"} Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.172684 4664 scope.go:117] "RemoveContainer" containerID="19b56281c965636b4ff097ca0f6d749632f22c733e4c76eaf2d6d5bda8d87c50" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.172859 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.201088 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-n7nck"] Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.212736 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.796925915 podStartE2EDuration="15.212704257s" podCreationTimestamp="2025-10-03 08:09:52 +0000 UTC" firstStartedPulling="2025-10-03 08:09:53.054174804 +0000 UTC m=+1293.875365294" lastFinishedPulling="2025-10-03 08:10:06.469953146 +0000 UTC m=+1307.291143636" observedRunningTime="2025-10-03 08:10:07.180071508 +0000 UTC m=+1308.001262018" watchObservedRunningTime="2025-10-03 08:10:07.212704257 +0000 UTC m=+1308.033894747" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.242750 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.252893 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.256943 4664 scope.go:117] "RemoveContainer" containerID="90abca23b94a01f13efa40f02002f7a387d97640b51b60640219ec78d31b2367" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.278565 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:07 crc kubenswrapper[4664]: E1003 08:10:07.279516 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="proxy-httpd" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.279554 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="proxy-httpd" Oct 03 08:10:07 crc kubenswrapper[4664]: E1003 08:10:07.280130 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="sg-core" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.280154 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="sg-core" Oct 03 08:10:07 crc kubenswrapper[4664]: E1003 08:10:07.280178 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="ceilometer-notification-agent" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.280186 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="ceilometer-notification-agent" Oct 03 08:10:07 crc kubenswrapper[4664]: E1003 08:10:07.280200 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="ceilometer-central-agent" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.280207 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="e259c239-66b0-409a-8819-91430929950a" 
containerName="ceilometer-central-agent" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.280415 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="proxy-httpd" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.280442 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="ceilometer-notification-agent" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.280453 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="ceilometer-central-agent" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.280461 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="e259c239-66b0-409a-8819-91430929950a" containerName="sg-core" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.283075 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.302365 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.313429 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.321142 4664 scope.go:117] "RemoveContainer" containerID="288a4092fd889c4bfc68e6659997e8fe06bbf3f67cd75e26f921c8e8b07f6632" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.336785 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.425749 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f05918-4374-41a2-803f-2a3b930a87ee-run-httpd\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.425957 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.426102 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh4m5\" (UniqueName: \"kubernetes.io/projected/39f05918-4374-41a2-803f-2a3b930a87ee-kube-api-access-fh4m5\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.426253 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f05918-4374-41a2-803f-2a3b930a87ee-log-httpd\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.426297 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-scripts\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0" Oct 
Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.426329 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0"
Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.426405 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-config-data\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0"
Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.528867 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f05918-4374-41a2-803f-2a3b930a87ee-run-httpd\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0"
Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.528942 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0"
Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.528985 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh4m5\" (UniqueName: \"kubernetes.io/projected/39f05918-4374-41a2-803f-2a3b930a87ee-kube-api-access-fh4m5\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0"
Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.529028 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f05918-4374-41a2-803f-2a3b930a87ee-log-httpd\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0"
Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.529048 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-scripts\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0"
Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.529068 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0"
Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.529098 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-config-data\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0"
Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.530006 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f05918-4374-41a2-803f-2a3b930a87ee-log-httpd\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0"
pod="openstack/ceilometer-0" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.530053 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f05918-4374-41a2-803f-2a3b930a87ee-run-httpd\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.540140 4664 scope.go:117] "RemoveContainer" containerID="9ae2ef0d9c43e7e6df456cbd87079e52b75a8230fbc13b17fa002c29763e687b" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.541130 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.543504 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-scripts\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.563107 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.572281 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-config-data\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.589377 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh4m5\" (UniqueName: \"kubernetes.io/projected/39f05918-4374-41a2-803f-2a3b930a87ee-kube-api-access-fh4m5\") pod \"ceilometer-0\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " pod="openstack/ceilometer-0" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.840684 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:07 crc kubenswrapper[4664]: I1003 08:10:07.891526 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e259c239-66b0-409a-8819-91430929950a" path="/var/lib/kubelet/pods/e259c239-66b0-409a-8819-91430929950a/volumes" Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.181169 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.211220 4664 generic.go:334] "Generic (PLEG): container finished" podID="baf2ab95-9e7c-4e4f-b653-c4dbb00b0745" containerID="a166b452207f8454dd4f6d38f6173d2e37f64c6aa48d74c3e1e4cdcc87843d0c" exitCode=0 Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.211302 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4q9vk" event={"ID":"baf2ab95-9e7c-4e4f-b653-c4dbb00b0745","Type":"ContainerDied","Data":"a166b452207f8454dd4f6d38f6173d2e37f64c6aa48d74c3e1e4cdcc87843d0c"} Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.231764 4664 generic.go:334] "Generic (PLEG): container finished" podID="76e6be14-99be-4c39-8b45-43cf4dd937c5" containerID="121c0f76bc2bdf81c4167dd3c3b09a06365ffdc73b78ef105096b7a9e8cf653a" exitCode=0 Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.231852 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-n7nck" event={"ID":"76e6be14-99be-4c39-8b45-43cf4dd937c5","Type":"ContainerDied","Data":"121c0f76bc2bdf81c4167dd3c3b09a06365ffdc73b78ef105096b7a9e8cf653a"} Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.231883 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-n7nck" event={"ID":"76e6be14-99be-4c39-8b45-43cf4dd937c5","Type":"ContainerStarted","Data":"f57fa68dfe68d122266301e2dc35efc6359b00577c901a8ba0096cb5f721eda4"} Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.233767 4664 generic.go:334] "Generic (PLEG): container finished" podID="1af25c5f-f748-4b42-8c46-f7929cb72bdf" containerID="4ac161b7ccb2f318cb332bf0e1727ea22d29eb30c89b84332c224f983685b3fa" exitCode=0 Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.233814 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zp2sz" event={"ID":"1af25c5f-f748-4b42-8c46-f7929cb72bdf","Type":"ContainerDied","Data":"4ac161b7ccb2f318cb332bf0e1727ea22d29eb30c89b84332c224f983685b3fa"} Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.233829 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zp2sz" event={"ID":"1af25c5f-f748-4b42-8c46-f7929cb72bdf","Type":"ContainerStarted","Data":"d90708c6ecf7cbfac991b2bbb4bee73e5e2e703300302773145b52efc41b84b0"} Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.266698 4664 generic.go:334] "Generic (PLEG): container finished" podID="30ce3373-ef30-4727-b57f-5be7963d1892" containerID="b67e64d8a85e712979c8209e18f5938ff206128e2c3d1b5097e5fa9956960c1c" exitCode=137 Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.267559 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76644f9584-br5jb" event={"ID":"30ce3373-ef30-4727-b57f-5be7963d1892","Type":"ContainerDied","Data":"b67e64d8a85e712979c8209e18f5938ff206128e2c3d1b5097e5fa9956960c1c"} Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.400118 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:08 crc kubenswrapper[4664]: 
Oct 03 08:10:08 crc kubenswrapper[4664]: W1003 08:10:08.400437 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39f05918_4374_41a2_803f_2a3b930a87ee.slice/crio-f2faa20a780834a4882ff6d753089e5042a28bc35d2df831f88ac4e02b0aa686 WatchSource:0}: Error finding container f2faa20a780834a4882ff6d753089e5042a28bc35d2df831f88ac4e02b0aa686: Status 404 returned error can't find the container with id f2faa20a780834a4882ff6d753089e5042a28bc35d2df831f88ac4e02b0aa686
Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.580486 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76644f9584-br5jb"
Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.658987 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-horizon-secret-key\") pod \"30ce3373-ef30-4727-b57f-5be7963d1892\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") "
Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.659104 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ce3373-ef30-4727-b57f-5be7963d1892-logs\") pod \"30ce3373-ef30-4727-b57f-5be7963d1892\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") "
Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.659153 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-horizon-tls-certs\") pod \"30ce3373-ef30-4727-b57f-5be7963d1892\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") "
Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.659298 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwxlz\" (UniqueName: \"kubernetes.io/projected/30ce3373-ef30-4727-b57f-5be7963d1892-kube-api-access-qwxlz\") pod \"30ce3373-ef30-4727-b57f-5be7963d1892\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") "
Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.659325 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30ce3373-ef30-4727-b57f-5be7963d1892-config-data\") pod \"30ce3373-ef30-4727-b57f-5be7963d1892\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") "
Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.659350 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-combined-ca-bundle\") pod \"30ce3373-ef30-4727-b57f-5be7963d1892\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") "
Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.659381 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30ce3373-ef30-4727-b57f-5be7963d1892-scripts\") pod \"30ce3373-ef30-4727-b57f-5be7963d1892\" (UID: \"30ce3373-ef30-4727-b57f-5be7963d1892\") "
Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.659500 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30ce3373-ef30-4727-b57f-5be7963d1892-logs" (OuterVolumeSpecName: "logs") pod "30ce3373-ef30-4727-b57f-5be7963d1892" (UID: "30ce3373-ef30-4727-b57f-5be7963d1892"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.660130 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ce3373-ef30-4727-b57f-5be7963d1892-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.670845 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "30ce3373-ef30-4727-b57f-5be7963d1892" (UID: "30ce3373-ef30-4727-b57f-5be7963d1892"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.670884 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ce3373-ef30-4727-b57f-5be7963d1892-kube-api-access-qwxlz" (OuterVolumeSpecName: "kube-api-access-qwxlz") pod "30ce3373-ef30-4727-b57f-5be7963d1892" (UID: "30ce3373-ef30-4727-b57f-5be7963d1892"). InnerVolumeSpecName "kube-api-access-qwxlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.739207 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30ce3373-ef30-4727-b57f-5be7963d1892-scripts" (OuterVolumeSpecName: "scripts") pod "30ce3373-ef30-4727-b57f-5be7963d1892" (UID: "30ce3373-ef30-4727-b57f-5be7963d1892"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.744197 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30ce3373-ef30-4727-b57f-5be7963d1892-config-data" (OuterVolumeSpecName: "config-data") pod "30ce3373-ef30-4727-b57f-5be7963d1892" (UID: "30ce3373-ef30-4727-b57f-5be7963d1892"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.753602 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30ce3373-ef30-4727-b57f-5be7963d1892" (UID: "30ce3373-ef30-4727-b57f-5be7963d1892"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.765076 4664 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.765106 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwxlz\" (UniqueName: \"kubernetes.io/projected/30ce3373-ef30-4727-b57f-5be7963d1892-kube-api-access-qwxlz\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.765120 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30ce3373-ef30-4727-b57f-5be7963d1892-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.765131 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.765139 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30ce3373-ef30-4727-b57f-5be7963d1892-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.766107 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.766379 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c7b4a287-6826-4b8f-945a-aad1d1deb92a" containerName="glance-log" containerID="cri-o://54d3fbc4e473e2f16d19aae198c89dc0d7dee395d64fa9560554f9bb24e82498" gracePeriod=30 Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.766452 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c7b4a287-6826-4b8f-945a-aad1d1deb92a" containerName="glance-httpd" containerID="cri-o://6e101e3f9ad77dfaaff43805f29a5ac8937b6ae5f4dcde279efd7cf4b3c253a0" gracePeriod=30 Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.789650 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "30ce3373-ef30-4727-b57f-5be7963d1892" (UID: "30ce3373-ef30-4727-b57f-5be7963d1892"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:08 crc kubenswrapper[4664]: I1003 08:10:08.867849 4664 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ce3373-ef30-4727-b57f-5be7963d1892-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.075930 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.278131 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f05918-4374-41a2-803f-2a3b930a87ee","Type":"ContainerStarted","Data":"f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13"} Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.279588 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f05918-4374-41a2-803f-2a3b930a87ee","Type":"ContainerStarted","Data":"f2faa20a780834a4882ff6d753089e5042a28bc35d2df831f88ac4e02b0aa686"} Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.281000 4664 generic.go:334] "Generic (PLEG): container finished" podID="6837bc3c-3e78-491c-8f0c-377955560009" containerID="eb1303c4f6e4d5ecb19909b618518eb2db4beabcd25acd9db5c40270d97f1986" exitCode=0 Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.281078 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6837bc3c-3e78-491c-8f0c-377955560009","Type":"ContainerDied","Data":"eb1303c4f6e4d5ecb19909b618518eb2db4beabcd25acd9db5c40270d97f1986"} Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.283747 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76644f9584-br5jb" event={"ID":"30ce3373-ef30-4727-b57f-5be7963d1892","Type":"ContainerDied","Data":"95b8c3354ba1366154c5515dcc0c60816b0b8b99bd0cfe357cac70a70d8a6930"} Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.283783 4664 scope.go:117] "RemoveContainer" containerID="78ce76bbf2905152072cab64efc16b2aac3c660597a68006c974ce630c008830" Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.283896 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76644f9584-br5jb" Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.298345 4664 generic.go:334] "Generic (PLEG): container finished" podID="c7b4a287-6826-4b8f-945a-aad1d1deb92a" containerID="54d3fbc4e473e2f16d19aae198c89dc0d7dee395d64fa9560554f9bb24e82498" exitCode=143 Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.298479 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7b4a287-6826-4b8f-945a-aad1d1deb92a","Type":"ContainerDied","Data":"54d3fbc4e473e2f16d19aae198c89dc0d7dee395d64fa9560554f9bb24e82498"} Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.339772 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76644f9584-br5jb"] Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.350394 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-76644f9584-br5jb"] Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.521867 4664 scope.go:117] "RemoveContainer" containerID="b67e64d8a85e712979c8209e18f5938ff206128e2c3d1b5097e5fa9956960c1c" Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.804149 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-4q9vk" Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.913051 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm9nn\" (UniqueName: \"kubernetes.io/projected/baf2ab95-9e7c-4e4f-b653-c4dbb00b0745-kube-api-access-xm9nn\") pod \"baf2ab95-9e7c-4e4f-b653-c4dbb00b0745\" (UID: \"baf2ab95-9e7c-4e4f-b653-c4dbb00b0745\") " Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.925086 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baf2ab95-9e7c-4e4f-b653-c4dbb00b0745-kube-api-access-xm9nn" (OuterVolumeSpecName: "kube-api-access-xm9nn") pod "baf2ab95-9e7c-4e4f-b653-c4dbb00b0745" (UID: "baf2ab95-9e7c-4e4f-b653-c4dbb00b0745"). InnerVolumeSpecName "kube-api-access-xm9nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.940444 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-n7nck" Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.951729 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.955695 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30ce3373-ef30-4727-b57f-5be7963d1892" path="/var/lib/kubelet/pods/30ce3373-ef30-4727-b57f-5be7963d1892/volumes" Oct 03 08:10:09 crc kubenswrapper[4664]: I1003 08:10:09.967389 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zp2sz" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.015279 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6837bc3c-3e78-491c-8f0c-377955560009-logs\") pod \"6837bc3c-3e78-491c-8f0c-377955560009\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.015333 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-public-tls-certs\") pod \"6837bc3c-3e78-491c-8f0c-377955560009\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.015357 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p494m\" (UniqueName: \"kubernetes.io/projected/6837bc3c-3e78-491c-8f0c-377955560009-kube-api-access-p494m\") pod \"6837bc3c-3e78-491c-8f0c-377955560009\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.015489 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-scripts\") pod \"6837bc3c-3e78-491c-8f0c-377955560009\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.015519 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62slg\" (UniqueName: \"kubernetes.io/projected/76e6be14-99be-4c39-8b45-43cf4dd937c5-kube-api-access-62slg\") pod \"76e6be14-99be-4c39-8b45-43cf4dd937c5\" (UID: \"76e6be14-99be-4c39-8b45-43cf4dd937c5\") " Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.015565 4664 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-combined-ca-bundle\") pod \"6837bc3c-3e78-491c-8f0c-377955560009\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.015633 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"6837bc3c-3e78-491c-8f0c-377955560009\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.015707 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6837bc3c-3e78-491c-8f0c-377955560009-httpd-run\") pod \"6837bc3c-3e78-491c-8f0c-377955560009\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.015767 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-config-data\") pod \"6837bc3c-3e78-491c-8f0c-377955560009\" (UID: \"6837bc3c-3e78-491c-8f0c-377955560009\") " Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.016174 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm9nn\" (UniqueName: \"kubernetes.io/projected/baf2ab95-9e7c-4e4f-b653-c4dbb00b0745-kube-api-access-xm9nn\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.018035 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6837bc3c-3e78-491c-8f0c-377955560009-logs" (OuterVolumeSpecName: "logs") pod "6837bc3c-3e78-491c-8f0c-377955560009" (UID: "6837bc3c-3e78-491c-8f0c-377955560009"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.026416 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "6837bc3c-3e78-491c-8f0c-377955560009" (UID: "6837bc3c-3e78-491c-8f0c-377955560009"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.029585 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6837bc3c-3e78-491c-8f0c-377955560009-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6837bc3c-3e78-491c-8f0c-377955560009" (UID: "6837bc3c-3e78-491c-8f0c-377955560009"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.049790 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6837bc3c-3e78-491c-8f0c-377955560009-kube-api-access-p494m" (OuterVolumeSpecName: "kube-api-access-p494m") pod "6837bc3c-3e78-491c-8f0c-377955560009" (UID: "6837bc3c-3e78-491c-8f0c-377955560009"). InnerVolumeSpecName "kube-api-access-p494m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.052950 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-scripts" (OuterVolumeSpecName: "scripts") pod "6837bc3c-3e78-491c-8f0c-377955560009" (UID: "6837bc3c-3e78-491c-8f0c-377955560009"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.095642 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76e6be14-99be-4c39-8b45-43cf4dd937c5-kube-api-access-62slg" (OuterVolumeSpecName: "kube-api-access-62slg") pod "76e6be14-99be-4c39-8b45-43cf4dd937c5" (UID: "76e6be14-99be-4c39-8b45-43cf4dd937c5"). InnerVolumeSpecName "kube-api-access-62slg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.102835 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6837bc3c-3e78-491c-8f0c-377955560009" (UID: "6837bc3c-3e78-491c-8f0c-377955560009"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.111805 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-config-data" (OuterVolumeSpecName: "config-data") pod "6837bc3c-3e78-491c-8f0c-377955560009" (UID: "6837bc3c-3e78-491c-8f0c-377955560009"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.121366 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4v7q\" (UniqueName: \"kubernetes.io/projected/1af25c5f-f748-4b42-8c46-f7929cb72bdf-kube-api-access-p4v7q\") pod \"1af25c5f-f748-4b42-8c46-f7929cb72bdf\" (UID: \"1af25c5f-f748-4b42-8c46-f7929cb72bdf\") " Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.122027 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.122058 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62slg\" (UniqueName: \"kubernetes.io/projected/76e6be14-99be-4c39-8b45-43cf4dd937c5-kube-api-access-62slg\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.122073 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.122098 4664 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.122112 4664 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6837bc3c-3e78-491c-8f0c-377955560009-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 
08:10:10.122122 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.122130 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6837bc3c-3e78-491c-8f0c-377955560009-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.122138 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p494m\" (UniqueName: \"kubernetes.io/projected/6837bc3c-3e78-491c-8f0c-377955560009-kube-api-access-p494m\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.128514 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1af25c5f-f748-4b42-8c46-f7929cb72bdf-kube-api-access-p4v7q" (OuterVolumeSpecName: "kube-api-access-p4v7q") pod "1af25c5f-f748-4b42-8c46-f7929cb72bdf" (UID: "1af25c5f-f748-4b42-8c46-f7929cb72bdf"). InnerVolumeSpecName "kube-api-access-p4v7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.144116 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6837bc3c-3e78-491c-8f0c-377955560009" (UID: "6837bc3c-3e78-491c-8f0c-377955560009"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.164255 4664 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.224271 4664 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6837bc3c-3e78-491c-8f0c-377955560009-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.224318 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4v7q\" (UniqueName: \"kubernetes.io/projected/1af25c5f-f748-4b42-8c46-f7929cb72bdf-kube-api-access-p4v7q\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.224335 4664 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.321398 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zp2sz" event={"ID":"1af25c5f-f748-4b42-8c46-f7929cb72bdf","Type":"ContainerDied","Data":"d90708c6ecf7cbfac991b2bbb4bee73e5e2e703300302773145b52efc41b84b0"} Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.321435 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-zp2sz" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.321447 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d90708c6ecf7cbfac991b2bbb4bee73e5e2e703300302773145b52efc41b84b0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.322826 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-n7nck" event={"ID":"76e6be14-99be-4c39-8b45-43cf4dd937c5","Type":"ContainerDied","Data":"f57fa68dfe68d122266301e2dc35efc6359b00577c901a8ba0096cb5f721eda4"} Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.322867 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f57fa68dfe68d122266301e2dc35efc6359b00577c901a8ba0096cb5f721eda4" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.322850 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-n7nck" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.325183 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6837bc3c-3e78-491c-8f0c-377955560009","Type":"ContainerDied","Data":"e2f8ee4ceeae5c30394f605c54391c20196dd76da65ac3552edb6244a3d1ab10"} Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.325382 4664 scope.go:117] "RemoveContainer" containerID="eb1303c4f6e4d5ecb19909b618518eb2db4beabcd25acd9db5c40270d97f1986" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.325627 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.340281 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f05918-4374-41a2-803f-2a3b930a87ee","Type":"ContainerStarted","Data":"cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd"} Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.342315 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4q9vk" event={"ID":"baf2ab95-9e7c-4e4f-b653-c4dbb00b0745","Type":"ContainerDied","Data":"73f0b8f74947e35e4c4af6500cff9f8b8f05000063c226f9dd70a003885ed614"} Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.342347 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73f0b8f74947e35e4c4af6500cff9f8b8f05000063c226f9dd70a003885ed614" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.342387 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-4q9vk" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.444985 4664 scope.go:117] "RemoveContainer" containerID="2de90cfa4619e4a13019591892950d05a8b7cfea2d32a02ef57d42d8c4658c76" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.482774 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.516941 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.526600 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:10:10 crc kubenswrapper[4664]: E1003 08:10:10.527063 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e6be14-99be-4c39-8b45-43cf4dd937c5" containerName="mariadb-database-create" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.527075 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e6be14-99be-4c39-8b45-43cf4dd937c5" containerName="mariadb-database-create" Oct 03 08:10:10 crc kubenswrapper[4664]: E1003 08:10:10.527098 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1af25c5f-f748-4b42-8c46-f7929cb72bdf" containerName="mariadb-database-create" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.527105 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af25c5f-f748-4b42-8c46-f7929cb72bdf" containerName="mariadb-database-create" Oct 03 08:10:10 crc kubenswrapper[4664]: E1003 08:10:10.527121 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf2ab95-9e7c-4e4f-b653-c4dbb00b0745" containerName="mariadb-database-create" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.527127 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf2ab95-9e7c-4e4f-b653-c4dbb00b0745" containerName="mariadb-database-create" Oct 03 08:10:10 crc kubenswrapper[4664]: E1003 08:10:10.527143 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ce3373-ef30-4727-b57f-5be7963d1892" containerName="horizon" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.527148 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ce3373-ef30-4727-b57f-5be7963d1892" containerName="horizon" Oct 03 08:10:10 crc kubenswrapper[4664]: E1003 08:10:10.527164 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837bc3c-3e78-491c-8f0c-377955560009" containerName="glance-log" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.527172 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837bc3c-3e78-491c-8f0c-377955560009" containerName="glance-log" Oct 03 08:10:10 crc kubenswrapper[4664]: E1003 08:10:10.527184 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837bc3c-3e78-491c-8f0c-377955560009" containerName="glance-httpd" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.527191 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837bc3c-3e78-491c-8f0c-377955560009" containerName="glance-httpd" Oct 03 08:10:10 crc kubenswrapper[4664]: E1003 08:10:10.527207 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ce3373-ef30-4727-b57f-5be7963d1892" containerName="horizon-log" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.527213 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ce3373-ef30-4727-b57f-5be7963d1892" containerName="horizon-log" Oct 03 08:10:10 crc 
kubenswrapper[4664]: I1003 08:10:10.527391 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="6837bc3c-3e78-491c-8f0c-377955560009" containerName="glance-httpd" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.527402 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="6837bc3c-3e78-491c-8f0c-377955560009" containerName="glance-log" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.527417 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="1af25c5f-f748-4b42-8c46-f7929cb72bdf" containerName="mariadb-database-create" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.527428 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ce3373-ef30-4727-b57f-5be7963d1892" containerName="horizon" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.527437 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf2ab95-9e7c-4e4f-b653-c4dbb00b0745" containerName="mariadb-database-create" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.527450 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ce3373-ef30-4727-b57f-5be7963d1892" containerName="horizon-log" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.527462 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e6be14-99be-4c39-8b45-43cf4dd937c5" containerName="mariadb-database-create" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.528456 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.531713 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.531885 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.536284 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.635482 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.635525 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-logs\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.635550 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.635868 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.635924 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgm2q\" (UniqueName: \"kubernetes.io/projected/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-kube-api-access-sgm2q\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.635974 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.636019 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.636059 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.737623 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.737681 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-logs\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.737712 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.737749 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.737798 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgm2q\" (UniqueName: \"kubernetes.io/projected/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-kube-api-access-sgm2q\") pod 
\"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.737854 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.737898 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.737939 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.738351 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.738379 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-logs\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.738694 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.757743 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.757993 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.758066 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " 
pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.758144 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgm2q\" (UniqueName: \"kubernetes.io/projected/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-kube-api-access-sgm2q\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.763707 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9a018c-ae4c-475f-a813-9b4cf0e51f49-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.780511 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"8f9a018c-ae4c-475f-a813-9b4cf0e51f49\") " pod="openstack/glance-default-external-api-0" Oct 03 08:10:10 crc kubenswrapper[4664]: I1003 08:10:10.932086 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:10:11 crc kubenswrapper[4664]: I1003 08:10:11.356050 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f05918-4374-41a2-803f-2a3b930a87ee","Type":"ContainerStarted","Data":"36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c"} Oct 03 08:10:11 crc kubenswrapper[4664]: I1003 08:10:11.565791 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:10:11 crc kubenswrapper[4664]: I1003 08:10:11.887171 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6837bc3c-3e78-491c-8f0c-377955560009" path="/var/lib/kubelet/pods/6837bc3c-3e78-491c-8f0c-377955560009/volumes" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.375524 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8f9a018c-ae4c-475f-a813-9b4cf0e51f49","Type":"ContainerStarted","Data":"ed93160e6b90bbd35ae9964775b94f7087afd8ef669766257b87cdd511eccce2"} Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.375955 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8f9a018c-ae4c-475f-a813-9b4cf0e51f49","Type":"ContainerStarted","Data":"8ff20fceb756dae3a01289710e39755f3a1f02467fb75bcbc66810a549e20b13"} Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.387174 4664 generic.go:334] "Generic (PLEG): container finished" podID="c7b4a287-6826-4b8f-945a-aad1d1deb92a" containerID="6e101e3f9ad77dfaaff43805f29a5ac8937b6ae5f4dcde279efd7cf4b3c253a0" exitCode=0 Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.387911 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7b4a287-6826-4b8f-945a-aad1d1deb92a","Type":"ContainerDied","Data":"6e101e3f9ad77dfaaff43805f29a5ac8937b6ae5f4dcde279efd7cf4b3c253a0"} Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.393468 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"39f05918-4374-41a2-803f-2a3b930a87ee","Type":"ContainerStarted","Data":"9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69"} Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.393745 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="ceilometer-central-agent" containerID="cri-o://f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13" gracePeriod=30 Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.393817 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.393852 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="sg-core" containerID="cri-o://36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c" gracePeriod=30 Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.393856 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="proxy-httpd" containerID="cri-o://9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69" gracePeriod=30 Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.393959 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="ceilometer-notification-agent" containerID="cri-o://cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd" gracePeriod=30 Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.557208 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.585211 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.252216829 podStartE2EDuration="5.585191255s" podCreationTimestamp="2025-10-03 08:10:07 +0000 UTC" firstStartedPulling="2025-10-03 08:10:08.402901217 +0000 UTC m=+1309.224091707" lastFinishedPulling="2025-10-03 08:10:11.735875643 +0000 UTC m=+1312.557066133" observedRunningTime="2025-10-03 08:10:12.432349238 +0000 UTC m=+1313.253539748" watchObservedRunningTime="2025-10-03 08:10:12.585191255 +0000 UTC m=+1313.406381745" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.686874 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-combined-ca-bundle\") pod \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.687474 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-internal-tls-certs\") pod \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.687519 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b4a287-6826-4b8f-945a-aad1d1deb92a-logs\") pod \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.687679 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-scripts\") pod \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.687722 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.687765 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-config-data\") pod \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.687853 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np7r7\" (UniqueName: \"kubernetes.io/projected/c7b4a287-6826-4b8f-945a-aad1d1deb92a-kube-api-access-np7r7\") pod \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.687893 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7b4a287-6826-4b8f-945a-aad1d1deb92a-httpd-run\") pod \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\" (UID: \"c7b4a287-6826-4b8f-945a-aad1d1deb92a\") " Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.689037 4664 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7b4a287-6826-4b8f-945a-aad1d1deb92a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c7b4a287-6826-4b8f-945a-aad1d1deb92a" (UID: "c7b4a287-6826-4b8f-945a-aad1d1deb92a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.692117 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7b4a287-6826-4b8f-945a-aad1d1deb92a-logs" (OuterVolumeSpecName: "logs") pod "c7b4a287-6826-4b8f-945a-aad1d1deb92a" (UID: "c7b4a287-6826-4b8f-945a-aad1d1deb92a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.696938 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "c7b4a287-6826-4b8f-945a-aad1d1deb92a" (UID: "c7b4a287-6826-4b8f-945a-aad1d1deb92a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.700826 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b4a287-6826-4b8f-945a-aad1d1deb92a-kube-api-access-np7r7" (OuterVolumeSpecName: "kube-api-access-np7r7") pod "c7b4a287-6826-4b8f-945a-aad1d1deb92a" (UID: "c7b4a287-6826-4b8f-945a-aad1d1deb92a"). InnerVolumeSpecName "kube-api-access-np7r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.707801 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-scripts" (OuterVolumeSpecName: "scripts") pod "c7b4a287-6826-4b8f-945a-aad1d1deb92a" (UID: "c7b4a287-6826-4b8f-945a-aad1d1deb92a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.730833 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7b4a287-6826-4b8f-945a-aad1d1deb92a" (UID: "c7b4a287-6826-4b8f-945a-aad1d1deb92a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.758958 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c7b4a287-6826-4b8f-945a-aad1d1deb92a" (UID: "c7b4a287-6826-4b8f-945a-aad1d1deb92a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.760988 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-config-data" (OuterVolumeSpecName: "config-data") pod "c7b4a287-6826-4b8f-945a-aad1d1deb92a" (UID: "c7b4a287-6826-4b8f-945a-aad1d1deb92a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.790172 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.790221 4664 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.790235 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b4a287-6826-4b8f-945a-aad1d1deb92a-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.790245 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.790283 4664 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.790298 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b4a287-6826-4b8f-945a-aad1d1deb92a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.790310 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np7r7\" (UniqueName: \"kubernetes.io/projected/c7b4a287-6826-4b8f-945a-aad1d1deb92a-kube-api-access-np7r7\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.790323 4664 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7b4a287-6826-4b8f-945a-aad1d1deb92a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.810467 4664 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 03 08:10:12 crc kubenswrapper[4664]: I1003 08:10:12.892457 4664 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.206720 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.299428 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f05918-4374-41a2-803f-2a3b930a87ee-run-httpd\") pod \"39f05918-4374-41a2-803f-2a3b930a87ee\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.299505 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh4m5\" (UniqueName: \"kubernetes.io/projected/39f05918-4374-41a2-803f-2a3b930a87ee-kube-api-access-fh4m5\") pod \"39f05918-4374-41a2-803f-2a3b930a87ee\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.299569 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-config-data\") pod \"39f05918-4374-41a2-803f-2a3b930a87ee\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.299720 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-sg-core-conf-yaml\") pod \"39f05918-4374-41a2-803f-2a3b930a87ee\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.299808 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f05918-4374-41a2-803f-2a3b930a87ee-log-httpd\") pod \"39f05918-4374-41a2-803f-2a3b930a87ee\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.300162 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39f05918-4374-41a2-803f-2a3b930a87ee-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "39f05918-4374-41a2-803f-2a3b930a87ee" (UID: "39f05918-4374-41a2-803f-2a3b930a87ee"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.300293 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39f05918-4374-41a2-803f-2a3b930a87ee-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "39f05918-4374-41a2-803f-2a3b930a87ee" (UID: "39f05918-4374-41a2-803f-2a3b930a87ee"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.300399 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-scripts\") pod \"39f05918-4374-41a2-803f-2a3b930a87ee\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.300845 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-combined-ca-bundle\") pod \"39f05918-4374-41a2-803f-2a3b930a87ee\" (UID: \"39f05918-4374-41a2-803f-2a3b930a87ee\") " Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.301576 4664 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f05918-4374-41a2-803f-2a3b930a87ee-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.301636 4664 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39f05918-4374-41a2-803f-2a3b930a87ee-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.304804 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-scripts" (OuterVolumeSpecName: "scripts") pod "39f05918-4374-41a2-803f-2a3b930a87ee" (UID: "39f05918-4374-41a2-803f-2a3b930a87ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.307746 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39f05918-4374-41a2-803f-2a3b930a87ee-kube-api-access-fh4m5" (OuterVolumeSpecName: "kube-api-access-fh4m5") pod "39f05918-4374-41a2-803f-2a3b930a87ee" (UID: "39f05918-4374-41a2-803f-2a3b930a87ee"). InnerVolumeSpecName "kube-api-access-fh4m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.331126 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "39f05918-4374-41a2-803f-2a3b930a87ee" (UID: "39f05918-4374-41a2-803f-2a3b930a87ee"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.378694 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39f05918-4374-41a2-803f-2a3b930a87ee" (UID: "39f05918-4374-41a2-803f-2a3b930a87ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.403123 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8f9a018c-ae4c-475f-a813-9b4cf0e51f49","Type":"ContainerStarted","Data":"8f69ceed6b237f9d8f45885d9ac8670fe22c3ab4f1bd8482d4da70bb0a8dbe35"} Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.404033 4664 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.404132 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.404146 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.404157 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh4m5\" (UniqueName: \"kubernetes.io/projected/39f05918-4374-41a2-803f-2a3b930a87ee-kube-api-access-fh4m5\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.405812 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-config-data" (OuterVolumeSpecName: "config-data") pod "39f05918-4374-41a2-803f-2a3b930a87ee" (UID: "39f05918-4374-41a2-803f-2a3b930a87ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.406425 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7b4a287-6826-4b8f-945a-aad1d1deb92a","Type":"ContainerDied","Data":"ab3021c9bd21ce0c840f0cb79809e87d17c6f0d3f5331fb566d3f13df065f5c7"} Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.406500 4664 scope.go:117] "RemoveContainer" containerID="6e101e3f9ad77dfaaff43805f29a5ac8937b6ae5f4dcde279efd7cf4b3c253a0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.406457 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.410634 4664 generic.go:334] "Generic (PLEG): container finished" podID="39f05918-4374-41a2-803f-2a3b930a87ee" containerID="9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69" exitCode=0 Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.410657 4664 generic.go:334] "Generic (PLEG): container finished" podID="39f05918-4374-41a2-803f-2a3b930a87ee" containerID="36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c" exitCode=2 Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.410665 4664 generic.go:334] "Generic (PLEG): container finished" podID="39f05918-4374-41a2-803f-2a3b930a87ee" containerID="cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd" exitCode=0 Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.410672 4664 generic.go:334] "Generic (PLEG): container finished" podID="39f05918-4374-41a2-803f-2a3b930a87ee" containerID="f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13" exitCode=0 Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.410695 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f05918-4374-41a2-803f-2a3b930a87ee","Type":"ContainerDied","Data":"9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69"} Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.410735 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f05918-4374-41a2-803f-2a3b930a87ee","Type":"ContainerDied","Data":"36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c"} Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.410752 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f05918-4374-41a2-803f-2a3b930a87ee","Type":"ContainerDied","Data":"cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd"} Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.410768 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f05918-4374-41a2-803f-2a3b930a87ee","Type":"ContainerDied","Data":"f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13"} Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.410779 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39f05918-4374-41a2-803f-2a3b930a87ee","Type":"ContainerDied","Data":"f2faa20a780834a4882ff6d753089e5042a28bc35d2df831f88ac4e02b0aa686"} Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.410923 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.436056 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.4360355 podStartE2EDuration="3.4360355s" podCreationTimestamp="2025-10-03 08:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:10:13.422243129 +0000 UTC m=+1314.243433629" watchObservedRunningTime="2025-10-03 08:10:13.4360355 +0000 UTC m=+1314.257226010" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.495745 4664 scope.go:117] "RemoveContainer" containerID="54d3fbc4e473e2f16d19aae198c89dc0d7dee395d64fa9560554f9bb24e82498" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.502105 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.505821 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f05918-4374-41a2-803f-2a3b930a87ee-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.511536 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.515763 4664 scope.go:117] "RemoveContainer" containerID="9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.523758 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.534565 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.545785 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:13 crc kubenswrapper[4664]: E1003 08:10:13.546288 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b4a287-6826-4b8f-945a-aad1d1deb92a" containerName="glance-log" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.546323 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b4a287-6826-4b8f-945a-aad1d1deb92a" containerName="glance-log" Oct 03 08:10:13 crc kubenswrapper[4664]: E1003 08:10:13.546343 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b4a287-6826-4b8f-945a-aad1d1deb92a" containerName="glance-httpd" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.546351 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b4a287-6826-4b8f-945a-aad1d1deb92a" containerName="glance-httpd" Oct 03 08:10:13 crc kubenswrapper[4664]: E1003 08:10:13.546383 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="sg-core" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.546392 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="sg-core" Oct 03 08:10:13 crc kubenswrapper[4664]: E1003 08:10:13.546408 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="proxy-httpd" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.546416 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="proxy-httpd" 
Oct 03 08:10:13 crc kubenswrapper[4664]: E1003 08:10:13.546457 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="ceilometer-central-agent" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.546465 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="ceilometer-central-agent" Oct 03 08:10:13 crc kubenswrapper[4664]: E1003 08:10:13.546476 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="ceilometer-notification-agent" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.546484 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="ceilometer-notification-agent" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.546712 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="ceilometer-notification-agent" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.546740 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="proxy-httpd" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.546758 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="sg-core" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.546774 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b4a287-6826-4b8f-945a-aad1d1deb92a" containerName="glance-httpd" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.546793 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" containerName="ceilometer-central-agent" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.546804 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b4a287-6826-4b8f-945a-aad1d1deb92a" containerName="glance-log" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.547496 4664 scope.go:117] "RemoveContainer" containerID="36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.550274 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.552748 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.552748 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.559697 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.572244 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.574539 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.577098 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.577226 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.588199 4664 scope.go:117] "RemoveContainer" containerID="cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.593867 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.617040 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dbb7\" (UniqueName: \"kubernetes.io/projected/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-kube-api-access-5dbb7\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.617152 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.617211 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.617239 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-run-httpd\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.617311 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-scripts\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.617444 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-config-data\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.617487 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-log-httpd\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.642928 4664 scope.go:117] "RemoveContainer" 
containerID="f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.661902 4664 scope.go:117] "RemoveContainer" containerID="9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69" Oct 03 08:10:13 crc kubenswrapper[4664]: E1003 08:10:13.662338 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69\": container with ID starting with 9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69 not found: ID does not exist" containerID="9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.662385 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69"} err="failed to get container status \"9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69\": rpc error: code = NotFound desc = could not find container \"9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69\": container with ID starting with 9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69 not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.662414 4664 scope.go:117] "RemoveContainer" containerID="36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c" Oct 03 08:10:13 crc kubenswrapper[4664]: E1003 08:10:13.662692 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c\": container with ID starting with 36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c not found: ID does not exist" containerID="36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.662746 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c"} err="failed to get container status \"36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c\": rpc error: code = NotFound desc = could not find container \"36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c\": container with ID starting with 36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.662768 4664 scope.go:117] "RemoveContainer" containerID="cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd" Oct 03 08:10:13 crc kubenswrapper[4664]: E1003 08:10:13.663111 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd\": container with ID starting with cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd not found: ID does not exist" containerID="cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.663800 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd"} err="failed to get container status \"cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd\": rpc error: code = 
NotFound desc = could not find container \"cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd\": container with ID starting with cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.663825 4664 scope.go:117] "RemoveContainer" containerID="f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13" Oct 03 08:10:13 crc kubenswrapper[4664]: E1003 08:10:13.664802 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13\": container with ID starting with f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13 not found: ID does not exist" containerID="f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.664829 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13"} err="failed to get container status \"f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13\": rpc error: code = NotFound desc = could not find container \"f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13\": container with ID starting with f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13 not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.664843 4664 scope.go:117] "RemoveContainer" containerID="9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.665972 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69"} err="failed to get container status \"9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69\": rpc error: code = NotFound desc = could not find container \"9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69\": container with ID starting with 9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69 not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.666000 4664 scope.go:117] "RemoveContainer" containerID="36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.666435 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c"} err="failed to get container status \"36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c\": rpc error: code = NotFound desc = could not find container \"36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c\": container with ID starting with 36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.666461 4664 scope.go:117] "RemoveContainer" containerID="cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.666791 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd"} err="failed to get container status \"cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd\": rpc error: code = 
NotFound desc = could not find container \"cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd\": container with ID starting with cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.666828 4664 scope.go:117] "RemoveContainer" containerID="f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.667140 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13"} err="failed to get container status \"f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13\": rpc error: code = NotFound desc = could not find container \"f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13\": container with ID starting with f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13 not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.667169 4664 scope.go:117] "RemoveContainer" containerID="9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.667481 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69"} err="failed to get container status \"9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69\": rpc error: code = NotFound desc = could not find container \"9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69\": container with ID starting with 9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69 not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.667510 4664 scope.go:117] "RemoveContainer" containerID="36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.668647 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c"} err="failed to get container status \"36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c\": rpc error: code = NotFound desc = could not find container \"36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c\": container with ID starting with 36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.668670 4664 scope.go:117] "RemoveContainer" containerID="cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.673673 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd"} err="failed to get container status \"cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd\": rpc error: code = NotFound desc = could not find container \"cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd\": container with ID starting with cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.673719 4664 scope.go:117] "RemoveContainer" containerID="f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 
08:10:13.673995 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13"} err="failed to get container status \"f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13\": rpc error: code = NotFound desc = could not find container \"f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13\": container with ID starting with f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13 not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.674025 4664 scope.go:117] "RemoveContainer" containerID="9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.674408 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69"} err="failed to get container status \"9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69\": rpc error: code = NotFound desc = could not find container \"9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69\": container with ID starting with 9b3c49fc642f898589dc4e475b0a98f5594788af3de7d74fd192f5b797a2ac69 not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.674436 4664 scope.go:117] "RemoveContainer" containerID="36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.674682 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c"} err="failed to get container status \"36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c\": rpc error: code = NotFound desc = could not find container \"36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c\": container with ID starting with 36887796a526656751e45bad450d532ec401e7a929887b8453a70f02ffeef72c not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.674701 4664 scope.go:117] "RemoveContainer" containerID="cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.674872 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd"} err="failed to get container status \"cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd\": rpc error: code = NotFound desc = could not find container \"cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd\": container with ID starting with cfffdb7c698401b7554da6b4b7ee73035e758a8e8536f34355c78d3d3fdd0ccd not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.674890 4664 scope.go:117] "RemoveContainer" containerID="f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.675044 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13"} err="failed to get container status \"f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13\": rpc error: code = NotFound desc = could not find container \"f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13\": container with ID starting with 
f928b8b80226cdb896d8632a787444b44896de53c86391c49bde5f2936d2cd13 not found: ID does not exist" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.719036 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-config-data\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.719113 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-log-httpd\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.719167 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44bbde12-29da-4681-a413-58bd1db590e9-logs\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.719265 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44bbde12-29da-4681-a413-58bd1db590e9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.719307 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dbb7\" (UniqueName: \"kubernetes.io/projected/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-kube-api-access-5dbb7\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.719336 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.719357 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhsmn\" (UniqueName: \"kubernetes.io/projected/44bbde12-29da-4681-a413-58bd1db590e9-kube-api-access-fhsmn\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.719382 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44bbde12-29da-4681-a413-58bd1db590e9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.719416 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc 
kubenswrapper[4664]: I1003 08:10:13.719452 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.719479 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-run-httpd\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.719522 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-scripts\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.719625 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bbde12-29da-4681-a413-58bd1db590e9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.719664 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44bbde12-29da-4681-a413-58bd1db590e9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.719740 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44bbde12-29da-4681-a413-58bd1db590e9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.720747 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-log-httpd\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.721316 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-run-httpd\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.724978 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.730814 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-config-data\") pod \"ceilometer-0\" (UID: 
\"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.731593 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-scripts\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.731411 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.744467 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dbb7\" (UniqueName: \"kubernetes.io/projected/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-kube-api-access-5dbb7\") pod \"ceilometer-0\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.816773 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-30c0-account-create-6g9b4"] Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.818263 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-30c0-account-create-6g9b4" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.820844 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44bbde12-29da-4681-a413-58bd1db590e9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.820889 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.820909 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhsmn\" (UniqueName: \"kubernetes.io/projected/44bbde12-29da-4681-a413-58bd1db590e9-kube-api-access-fhsmn\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.820927 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44bbde12-29da-4681-a413-58bd1db590e9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.820975 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bbde12-29da-4681-a413-58bd1db590e9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.820991 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/44bbde12-29da-4681-a413-58bd1db590e9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.821018 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44bbde12-29da-4681-a413-58bd1db590e9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.821064 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44bbde12-29da-4681-a413-58bd1db590e9-logs\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.821493 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44bbde12-29da-4681-a413-58bd1db590e9-logs\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.821769 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44bbde12-29da-4681-a413-58bd1db590e9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.822171 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.826476 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-30c0-account-create-6g9b4"] Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.827707 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.835686 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bbde12-29da-4681-a413-58bd1db590e9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.836301 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44bbde12-29da-4681-a413-58bd1db590e9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.836985 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44bbde12-29da-4681-a413-58bd1db590e9-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.837545 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44bbde12-29da-4681-a413-58bd1db590e9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.852294 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhsmn\" (UniqueName: \"kubernetes.io/projected/44bbde12-29da-4681-a413-58bd1db590e9-kube-api-access-fhsmn\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.875097 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.879114 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"44bbde12-29da-4681-a413-58bd1db590e9\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.896151 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.923594 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86lc6\" (UniqueName: \"kubernetes.io/projected/569e62d3-ff5a-483b-914c-c17b6b1c35da-kube-api-access-86lc6\") pod \"nova-api-30c0-account-create-6g9b4\" (UID: \"569e62d3-ff5a-483b-914c-c17b6b1c35da\") " pod="openstack/nova-api-30c0-account-create-6g9b4" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.924425 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39f05918-4374-41a2-803f-2a3b930a87ee" path="/var/lib/kubelet/pods/39f05918-4374-41a2-803f-2a3b930a87ee/volumes" Oct 03 08:10:13 crc kubenswrapper[4664]: I1003 08:10:13.925463 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7b4a287-6826-4b8f-945a-aad1d1deb92a" path="/var/lib/kubelet/pods/c7b4a287-6826-4b8f-945a-aad1d1deb92a/volumes" Oct 03 08:10:14 crc kubenswrapper[4664]: I1003 08:10:14.026161 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86lc6\" (UniqueName: \"kubernetes.io/projected/569e62d3-ff5a-483b-914c-c17b6b1c35da-kube-api-access-86lc6\") pod \"nova-api-30c0-account-create-6g9b4\" (UID: \"569e62d3-ff5a-483b-914c-c17b6b1c35da\") " pod="openstack/nova-api-30c0-account-create-6g9b4" Oct 03 08:10:14 crc kubenswrapper[4664]: I1003 08:10:14.049397 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86lc6\" (UniqueName: \"kubernetes.io/projected/569e62d3-ff5a-483b-914c-c17b6b1c35da-kube-api-access-86lc6\") pod \"nova-api-30c0-account-create-6g9b4\" (UID: \"569e62d3-ff5a-483b-914c-c17b6b1c35da\") " pod="openstack/nova-api-30c0-account-create-6g9b4" Oct 03 08:10:14 crc kubenswrapper[4664]: I1003 08:10:14.153859 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-30c0-account-create-6g9b4" Oct 03 08:10:14 crc kubenswrapper[4664]: I1003 08:10:14.476638 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:14 crc kubenswrapper[4664]: I1003 08:10:14.508429 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:10:14 crc kubenswrapper[4664]: I1003 08:10:14.646909 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-30c0-account-create-6g9b4"] Oct 03 08:10:15 crc kubenswrapper[4664]: I1003 08:10:15.448002 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab","Type":"ContainerStarted","Data":"f188b6abfe92d2768973c04603502f1b475972abd8c6e285eab68c55ec751d08"} Oct 03 08:10:15 crc kubenswrapper[4664]: I1003 08:10:15.448357 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab","Type":"ContainerStarted","Data":"2233cd1e8c8678774f3751d00c0073508770c661016916d9687241b1e42696b6"} Oct 03 08:10:15 crc kubenswrapper[4664]: I1003 08:10:15.450321 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44bbde12-29da-4681-a413-58bd1db590e9","Type":"ContainerStarted","Data":"9d0d771c02bba7affc2c9fef4376803bf0c20dbdf8090a1a37d287c16364aaaf"} Oct 03 08:10:15 crc kubenswrapper[4664]: I1003 08:10:15.450376 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44bbde12-29da-4681-a413-58bd1db590e9","Type":"ContainerStarted","Data":"ce5f1ca95af8abb82af3464c81a49249f00c374f9ea6e6de19d79a9f4aa607b4"} Oct 03 08:10:15 crc kubenswrapper[4664]: I1003 08:10:15.462218 4664 generic.go:334] "Generic (PLEG): container finished" podID="569e62d3-ff5a-483b-914c-c17b6b1c35da" containerID="1efed958dfcf921e8fc2ef0cf2a433973765b9179ebc906bf355b06b51e5e0d5" exitCode=0 Oct 03 08:10:15 crc kubenswrapper[4664]: I1003 08:10:15.462263 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-30c0-account-create-6g9b4" event={"ID":"569e62d3-ff5a-483b-914c-c17b6b1c35da","Type":"ContainerDied","Data":"1efed958dfcf921e8fc2ef0cf2a433973765b9179ebc906bf355b06b51e5e0d5"} Oct 03 08:10:15 crc kubenswrapper[4664]: I1003 08:10:15.462288 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-30c0-account-create-6g9b4" event={"ID":"569e62d3-ff5a-483b-914c-c17b6b1c35da","Type":"ContainerStarted","Data":"9667c0122b4cccd75f07e881225736718eb410778699233497345a00de322d69"} Oct 03 08:10:16 crc kubenswrapper[4664]: I1003 08:10:16.477003 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44bbde12-29da-4681-a413-58bd1db590e9","Type":"ContainerStarted","Data":"c190eb87178226c80bb65185d40aee8fb9357a96816323b73966c2eb0f14fa5e"} Oct 03 08:10:16 crc kubenswrapper[4664]: I1003 08:10:16.481319 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab","Type":"ContainerStarted","Data":"30fe7c9880f63ac84624ccc8ee76c8ac21903321376dd9b77df690f2041e332a"} Oct 03 08:10:16 crc kubenswrapper[4664]: I1003 08:10:16.504347 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.504320366 podStartE2EDuration="3.504320366s" 
podCreationTimestamp="2025-10-03 08:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:10:16.496396565 +0000 UTC m=+1317.317587055" watchObservedRunningTime="2025-10-03 08:10:16.504320366 +0000 UTC m=+1317.325510866" Oct 03 08:10:16 crc kubenswrapper[4664]: I1003 08:10:16.757465 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:16 crc kubenswrapper[4664]: I1003 08:10:16.919358 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-30c0-account-create-6g9b4" Oct 03 08:10:16 crc kubenswrapper[4664]: I1003 08:10:16.990706 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86lc6\" (UniqueName: \"kubernetes.io/projected/569e62d3-ff5a-483b-914c-c17b6b1c35da-kube-api-access-86lc6\") pod \"569e62d3-ff5a-483b-914c-c17b6b1c35da\" (UID: \"569e62d3-ff5a-483b-914c-c17b6b1c35da\") " Oct 03 08:10:16 crc kubenswrapper[4664]: I1003 08:10:16.996383 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/569e62d3-ff5a-483b-914c-c17b6b1c35da-kube-api-access-86lc6" (OuterVolumeSpecName: "kube-api-access-86lc6") pod "569e62d3-ff5a-483b-914c-c17b6b1c35da" (UID: "569e62d3-ff5a-483b-914c-c17b6b1c35da"). InnerVolumeSpecName "kube-api-access-86lc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:10:17 crc kubenswrapper[4664]: I1003 08:10:17.093074 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86lc6\" (UniqueName: \"kubernetes.io/projected/569e62d3-ff5a-483b-914c-c17b6b1c35da-kube-api-access-86lc6\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:17 crc kubenswrapper[4664]: I1003 08:10:17.491558 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-30c0-account-create-6g9b4" event={"ID":"569e62d3-ff5a-483b-914c-c17b6b1c35da","Type":"ContainerDied","Data":"9667c0122b4cccd75f07e881225736718eb410778699233497345a00de322d69"} Oct 03 08:10:17 crc kubenswrapper[4664]: I1003 08:10:17.491807 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9667c0122b4cccd75f07e881225736718eb410778699233497345a00de322d69" Oct 03 08:10:17 crc kubenswrapper[4664]: I1003 08:10:17.491584 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-30c0-account-create-6g9b4" Oct 03 08:10:17 crc kubenswrapper[4664]: I1003 08:10:17.493674 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab","Type":"ContainerStarted","Data":"7332f16d2429511eedc1167896cb35e925c52dc35d8803544a2573bca86b1b41"} Oct 03 08:10:18 crc kubenswrapper[4664]: I1003 08:10:18.504045 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab","Type":"ContainerStarted","Data":"862b2c3e159a2dfe1c941d93d4118ca59a80b43da9c11926d1a151979cbedf07"} Oct 03 08:10:18 crc kubenswrapper[4664]: I1003 08:10:18.504641 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 08:10:18 crc kubenswrapper[4664]: I1003 08:10:18.504674 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="ceilometer-central-agent" containerID="cri-o://f188b6abfe92d2768973c04603502f1b475972abd8c6e285eab68c55ec751d08" gracePeriod=30 Oct 03 08:10:18 crc kubenswrapper[4664]: I1003 08:10:18.504700 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="proxy-httpd" containerID="cri-o://862b2c3e159a2dfe1c941d93d4118ca59a80b43da9c11926d1a151979cbedf07" gracePeriod=30 Oct 03 08:10:18 crc kubenswrapper[4664]: I1003 08:10:18.504710 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="ceilometer-notification-agent" containerID="cri-o://30fe7c9880f63ac84624ccc8ee76c8ac21903321376dd9b77df690f2041e332a" gracePeriod=30 Oct 03 08:10:18 crc kubenswrapper[4664]: I1003 08:10:18.504683 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="sg-core" containerID="cri-o://7332f16d2429511eedc1167896cb35e925c52dc35d8803544a2573bca86b1b41" gracePeriod=30 Oct 03 08:10:18 crc kubenswrapper[4664]: I1003 08:10:18.543028 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.939538517 podStartE2EDuration="5.543004413s" podCreationTimestamp="2025-10-03 08:10:13 +0000 UTC" firstStartedPulling="2025-10-03 08:10:14.495887378 +0000 UTC m=+1315.317077868" lastFinishedPulling="2025-10-03 08:10:18.099353274 +0000 UTC m=+1318.920543764" observedRunningTime="2025-10-03 08:10:18.536908156 +0000 UTC m=+1319.358098666" watchObservedRunningTime="2025-10-03 08:10:18.543004413 +0000 UTC m=+1319.364194923" Oct 03 08:10:19 crc kubenswrapper[4664]: I1003 08:10:19.515724 4664 generic.go:334] "Generic (PLEG): container finished" podID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerID="862b2c3e159a2dfe1c941d93d4118ca59a80b43da9c11926d1a151979cbedf07" exitCode=0 Oct 03 08:10:19 crc kubenswrapper[4664]: I1003 08:10:19.516001 4664 generic.go:334] "Generic (PLEG): container finished" podID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerID="7332f16d2429511eedc1167896cb35e925c52dc35d8803544a2573bca86b1b41" exitCode=2 Oct 03 08:10:19 crc kubenswrapper[4664]: I1003 08:10:19.515934 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab","Type":"ContainerDied","Data":"862b2c3e159a2dfe1c941d93d4118ca59a80b43da9c11926d1a151979cbedf07"} Oct 03 08:10:19 crc kubenswrapper[4664]: I1003 08:10:19.516015 4664 generic.go:334] "Generic (PLEG): container finished" podID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerID="30fe7c9880f63ac84624ccc8ee76c8ac21903321376dd9b77df690f2041e332a" exitCode=0 Oct 03 08:10:19 crc kubenswrapper[4664]: I1003 08:10:19.516066 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab","Type":"ContainerDied","Data":"7332f16d2429511eedc1167896cb35e925c52dc35d8803544a2573bca86b1b41"} Oct 03 08:10:19 crc kubenswrapper[4664]: I1003 08:10:19.516081 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab","Type":"ContainerDied","Data":"30fe7c9880f63ac84624ccc8ee76c8ac21903321376dd9b77df690f2041e332a"} Oct 03 08:10:20 crc kubenswrapper[4664]: I1003 08:10:20.932973 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 08:10:20 crc kubenswrapper[4664]: I1003 08:10:20.933339 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 08:10:20 crc kubenswrapper[4664]: I1003 08:10:20.978050 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 08:10:20 crc kubenswrapper[4664]: I1003 08:10:20.982575 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 08:10:21 crc kubenswrapper[4664]: I1003 08:10:21.548475 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 08:10:21 crc kubenswrapper[4664]: I1003 08:10:21.548781 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 08:10:23 crc kubenswrapper[4664]: I1003 08:10:23.875486 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 08:10:23 crc kubenswrapper[4664]: I1003 08:10:23.876027 4664 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:10:23 crc kubenswrapper[4664]: I1003 08:10:23.893474 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 08:10:23 crc kubenswrapper[4664]: I1003 08:10:23.896342 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 08:10:23 crc kubenswrapper[4664]: I1003 08:10:23.896395 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 08:10:23 crc kubenswrapper[4664]: I1003 08:10:23.941222 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 08:10:23 crc kubenswrapper[4664]: I1003 08:10:23.973098 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.030422 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-fa05-account-create-vslm4"] Oct 03 08:10:24 crc kubenswrapper[4664]: E1003 
08:10:24.030834 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="569e62d3-ff5a-483b-914c-c17b6b1c35da" containerName="mariadb-account-create" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.030849 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="569e62d3-ff5a-483b-914c-c17b6b1c35da" containerName="mariadb-account-create" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.031031 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="569e62d3-ff5a-483b-914c-c17b6b1c35da" containerName="mariadb-account-create" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.031719 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fa05-account-create-vslm4" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.036132 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.047969 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fa05-account-create-vslm4"] Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.137006 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kr84\" (UniqueName: \"kubernetes.io/projected/53a3e54d-79e2-48c5-84df-499f3ac97222-kube-api-access-6kr84\") pod \"nova-cell0-fa05-account-create-vslm4\" (UID: \"53a3e54d-79e2-48c5-84df-499f3ac97222\") " pod="openstack/nova-cell0-fa05-account-create-vslm4" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.234856 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4ca0-account-create-8x7r2"] Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.236556 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4ca0-account-create-8x7r2" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.238895 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kr84\" (UniqueName: \"kubernetes.io/projected/53a3e54d-79e2-48c5-84df-499f3ac97222-kube-api-access-6kr84\") pod \"nova-cell0-fa05-account-create-vslm4\" (UID: \"53a3e54d-79e2-48c5-84df-499f3ac97222\") " pod="openstack/nova-cell0-fa05-account-create-vslm4" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.238988 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.244740 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4ca0-account-create-8x7r2"] Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.259973 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kr84\" (UniqueName: \"kubernetes.io/projected/53a3e54d-79e2-48c5-84df-499f3ac97222-kube-api-access-6kr84\") pod \"nova-cell0-fa05-account-create-vslm4\" (UID: \"53a3e54d-79e2-48c5-84df-499f3ac97222\") " pod="openstack/nova-cell0-fa05-account-create-vslm4" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.340512 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xjlf\" (UniqueName: \"kubernetes.io/projected/17d3f5d3-82e5-4954-83a6-a7e0255df488-kube-api-access-4xjlf\") pod \"nova-cell1-4ca0-account-create-8x7r2\" (UID: \"17d3f5d3-82e5-4954-83a6-a7e0255df488\") " pod="openstack/nova-cell1-4ca0-account-create-8x7r2" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.362085 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-fa05-account-create-vslm4" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.442817 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xjlf\" (UniqueName: \"kubernetes.io/projected/17d3f5d3-82e5-4954-83a6-a7e0255df488-kube-api-access-4xjlf\") pod \"nova-cell1-4ca0-account-create-8x7r2\" (UID: \"17d3f5d3-82e5-4954-83a6-a7e0255df488\") " pod="openstack/nova-cell1-4ca0-account-create-8x7r2" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.472488 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xjlf\" (UniqueName: \"kubernetes.io/projected/17d3f5d3-82e5-4954-83a6-a7e0255df488-kube-api-access-4xjlf\") pod \"nova-cell1-4ca0-account-create-8x7r2\" (UID: \"17d3f5d3-82e5-4954-83a6-a7e0255df488\") " pod="openstack/nova-cell1-4ca0-account-create-8x7r2" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.586265 4664 generic.go:334] "Generic (PLEG): container finished" podID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerID="f188b6abfe92d2768973c04603502f1b475972abd8c6e285eab68c55ec751d08" exitCode=0 Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.587444 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab","Type":"ContainerDied","Data":"f188b6abfe92d2768973c04603502f1b475972abd8c6e285eab68c55ec751d08"} Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.588568 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.588595 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.611866 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4ca0-account-create-8x7r2" Oct 03 08:10:24 crc kubenswrapper[4664]: I1003 08:10:24.852482 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fa05-account-create-vslm4"] Oct 03 08:10:24 crc kubenswrapper[4664]: W1003 08:10:24.859130 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53a3e54d_79e2_48c5_84df_499f3ac97222.slice/crio-02528db2c600286d74c598bc98070aa26945b1e1c04301b3c3cf8e6c531ff23a WatchSource:0}: Error finding container 02528db2c600286d74c598bc98070aa26945b1e1c04301b3c3cf8e6c531ff23a: Status 404 returned error can't find the container with id 02528db2c600286d74c598bc98070aa26945b1e1c04301b3c3cf8e6c531ff23a Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.079461 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.086360 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4ca0-account-create-8x7r2"] Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.162718 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dbb7\" (UniqueName: \"kubernetes.io/projected/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-kube-api-access-5dbb7\") pod \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.162815 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-log-httpd\") pod \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.162870 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-config-data\") pod \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.162894 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-combined-ca-bundle\") pod \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.162966 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-sg-core-conf-yaml\") pod \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.163046 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-run-httpd\") pod \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.163149 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-scripts\") pod \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\" (UID: \"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab\") " Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.166373 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" (UID: "42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.167069 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" (UID: "42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.170057 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-scripts" (OuterVolumeSpecName: "scripts") pod "42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" (UID: "42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.170919 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-kube-api-access-5dbb7" (OuterVolumeSpecName: "kube-api-access-5dbb7") pod "42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" (UID: "42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab"). InnerVolumeSpecName "kube-api-access-5dbb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.197269 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" (UID: "42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.257237 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" (UID: "42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.265101 4664 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.265137 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.265148 4664 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.265157 4664 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.265165 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.265174 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dbb7\" (UniqueName: \"kubernetes.io/projected/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-kube-api-access-5dbb7\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.305766 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-config-data" (OuterVolumeSpecName: "config-data") pod "42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" (UID: "42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.366904 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.596340 4664 generic.go:334] "Generic (PLEG): container finished" podID="17d3f5d3-82e5-4954-83a6-a7e0255df488" containerID="457c614fa57ae92ea61f3660bf6ce50cb90ad8ec656c3e162df330f5da85a486" exitCode=0 Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.596575 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4ca0-account-create-8x7r2" event={"ID":"17d3f5d3-82e5-4954-83a6-a7e0255df488","Type":"ContainerDied","Data":"457c614fa57ae92ea61f3660bf6ce50cb90ad8ec656c3e162df330f5da85a486"} Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.596750 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4ca0-account-create-8x7r2" event={"ID":"17d3f5d3-82e5-4954-83a6-a7e0255df488","Type":"ContainerStarted","Data":"9cc94325fbaf7847f8121c71ca6b0f1a198d1341e6e644e9be0f851ac53f4721"} Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.599238 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab","Type":"ContainerDied","Data":"2233cd1e8c8678774f3751d00c0073508770c661016916d9687241b1e42696b6"} Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.599278 4664 scope.go:117] "RemoveContainer" containerID="862b2c3e159a2dfe1c941d93d4118ca59a80b43da9c11926d1a151979cbedf07" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.599483 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.601963 4664 generic.go:334] "Generic (PLEG): container finished" podID="53a3e54d-79e2-48c5-84df-499f3ac97222" containerID="6c1a4104f1eb041df233e6d4b76bb980a6742eb3acd7a3585ff5f731ec33c810" exitCode=0 Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.602961 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fa05-account-create-vslm4" event={"ID":"53a3e54d-79e2-48c5-84df-499f3ac97222","Type":"ContainerDied","Data":"6c1a4104f1eb041df233e6d4b76bb980a6742eb3acd7a3585ff5f731ec33c810"} Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.602993 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fa05-account-create-vslm4" event={"ID":"53a3e54d-79e2-48c5-84df-499f3ac97222","Type":"ContainerStarted","Data":"02528db2c600286d74c598bc98070aa26945b1e1c04301b3c3cf8e6c531ff23a"} Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.624107 4664 scope.go:117] "RemoveContainer" containerID="7332f16d2429511eedc1167896cb35e925c52dc35d8803544a2573bca86b1b41" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.649225 4664 scope.go:117] "RemoveContainer" containerID="30fe7c9880f63ac84624ccc8ee76c8ac21903321376dd9b77df690f2041e332a" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.660859 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.670724 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.677362 4664 scope.go:117] "RemoveContainer" containerID="f188b6abfe92d2768973c04603502f1b475972abd8c6e285eab68c55ec751d08" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.694312 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:25 crc kubenswrapper[4664]: E1003 08:10:25.694850 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="ceilometer-central-agent" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.694875 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="ceilometer-central-agent" Oct 03 08:10:25 crc kubenswrapper[4664]: E1003 08:10:25.694929 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="proxy-httpd" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.694938 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="proxy-httpd" Oct 03 08:10:25 crc kubenswrapper[4664]: E1003 08:10:25.694956 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="ceilometer-notification-agent" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.694963 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="ceilometer-notification-agent" Oct 03 08:10:25 crc kubenswrapper[4664]: E1003 08:10:25.694971 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="sg-core" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.694977 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="sg-core" Oct 03 
08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.695169 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="proxy-httpd" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.695184 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="ceilometer-central-agent" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.695207 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="ceilometer-notification-agent" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.695216 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" containerName="sg-core" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.697114 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.703430 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.707148 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.707673 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.776463 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27s9p\" (UniqueName: \"kubernetes.io/projected/696b55ca-18fe-412e-a3d9-08c9105679b5-kube-api-access-27s9p\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.776530 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.776576 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696b55ca-18fe-412e-a3d9-08c9105679b5-run-httpd\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.776615 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-scripts\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.776714 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.776765 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/696b55ca-18fe-412e-a3d9-08c9105679b5-log-httpd\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.776815 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-config-data\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.877783 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696b55ca-18fe-412e-a3d9-08c9105679b5-log-httpd\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.877835 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-config-data\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.877933 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27s9p\" (UniqueName: \"kubernetes.io/projected/696b55ca-18fe-412e-a3d9-08c9105679b5-kube-api-access-27s9p\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.877967 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.878009 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696b55ca-18fe-412e-a3d9-08c9105679b5-run-httpd\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.878035 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-scripts\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.878101 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.878271 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696b55ca-18fe-412e-a3d9-08c9105679b5-log-httpd\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.879384 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/696b55ca-18fe-412e-a3d9-08c9105679b5-run-httpd\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.883493 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-scripts\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.883732 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.883879 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.887629 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-config-data\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.900338 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27s9p\" (UniqueName: \"kubernetes.io/projected/696b55ca-18fe-412e-a3d9-08c9105679b5-kube-api-access-27s9p\") pod \"ceilometer-0\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " pod="openstack/ceilometer-0" Oct 03 08:10:25 crc kubenswrapper[4664]: I1003 08:10:25.920581 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab" path="/var/lib/kubelet/pods/42c9b4c4-f252-4cc8-8e2e-f66c1e7268ab/volumes" Oct 03 08:10:26 crc kubenswrapper[4664]: I1003 08:10:26.030503 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:26 crc kubenswrapper[4664]: I1003 08:10:26.488887 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:26 crc kubenswrapper[4664]: I1003 08:10:26.651992 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696b55ca-18fe-412e-a3d9-08c9105679b5","Type":"ContainerStarted","Data":"996fbce09d75f070649475cc59053687dc07073bf3abecff4d0c7541fb50de69"} Oct 03 08:10:26 crc kubenswrapper[4664]: I1003 08:10:26.653368 4664 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:10:26 crc kubenswrapper[4664]: I1003 08:10:26.653395 4664 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:10:26 crc kubenswrapper[4664]: I1003 08:10:26.788291 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 08:10:26 crc kubenswrapper[4664]: I1003 08:10:26.792705 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 08:10:27 crc kubenswrapper[4664]: I1003 08:10:27.105431 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-fa05-account-create-vslm4" Oct 03 08:10:27 crc kubenswrapper[4664]: I1003 08:10:27.152340 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4ca0-account-create-8x7r2" Oct 03 08:10:27 crc kubenswrapper[4664]: I1003 08:10:27.247392 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kr84\" (UniqueName: \"kubernetes.io/projected/53a3e54d-79e2-48c5-84df-499f3ac97222-kube-api-access-6kr84\") pod \"53a3e54d-79e2-48c5-84df-499f3ac97222\" (UID: \"53a3e54d-79e2-48c5-84df-499f3ac97222\") " Oct 03 08:10:27 crc kubenswrapper[4664]: I1003 08:10:27.247692 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xjlf\" (UniqueName: \"kubernetes.io/projected/17d3f5d3-82e5-4954-83a6-a7e0255df488-kube-api-access-4xjlf\") pod \"17d3f5d3-82e5-4954-83a6-a7e0255df488\" (UID: \"17d3f5d3-82e5-4954-83a6-a7e0255df488\") " Oct 03 08:10:27 crc kubenswrapper[4664]: I1003 08:10:27.266940 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17d3f5d3-82e5-4954-83a6-a7e0255df488-kube-api-access-4xjlf" (OuterVolumeSpecName: "kube-api-access-4xjlf") pod "17d3f5d3-82e5-4954-83a6-a7e0255df488" (UID: "17d3f5d3-82e5-4954-83a6-a7e0255df488"). InnerVolumeSpecName "kube-api-access-4xjlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:10:27 crc kubenswrapper[4664]: I1003 08:10:27.275411 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53a3e54d-79e2-48c5-84df-499f3ac97222-kube-api-access-6kr84" (OuterVolumeSpecName: "kube-api-access-6kr84") pod "53a3e54d-79e2-48c5-84df-499f3ac97222" (UID: "53a3e54d-79e2-48c5-84df-499f3ac97222"). InnerVolumeSpecName "kube-api-access-6kr84". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:10:27 crc kubenswrapper[4664]: I1003 08:10:27.351227 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xjlf\" (UniqueName: \"kubernetes.io/projected/17d3f5d3-82e5-4954-83a6-a7e0255df488-kube-api-access-4xjlf\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:27 crc kubenswrapper[4664]: I1003 08:10:27.351273 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kr84\" (UniqueName: \"kubernetes.io/projected/53a3e54d-79e2-48c5-84df-499f3ac97222-kube-api-access-6kr84\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:27 crc kubenswrapper[4664]: I1003 08:10:27.668814 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696b55ca-18fe-412e-a3d9-08c9105679b5","Type":"ContainerStarted","Data":"7f9b6652747d6f92d5a95cbae57ff217f7c78d4c55a1e2b1a27f5b2bf381d162"} Oct 03 08:10:27 crc kubenswrapper[4664]: I1003 08:10:27.672324 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fa05-account-create-vslm4" event={"ID":"53a3e54d-79e2-48c5-84df-499f3ac97222","Type":"ContainerDied","Data":"02528db2c600286d74c598bc98070aa26945b1e1c04301b3c3cf8e6c531ff23a"} Oct 03 08:10:27 crc kubenswrapper[4664]: I1003 08:10:27.672379 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02528db2c600286d74c598bc98070aa26945b1e1c04301b3c3cf8e6c531ff23a" Oct 03 08:10:27 crc kubenswrapper[4664]: I1003 08:10:27.672454 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-fa05-account-create-vslm4" Oct 03 08:10:27 crc kubenswrapper[4664]: I1003 08:10:27.681273 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4ca0-account-create-8x7r2" event={"ID":"17d3f5d3-82e5-4954-83a6-a7e0255df488","Type":"ContainerDied","Data":"9cc94325fbaf7847f8121c71ca6b0f1a198d1341e6e644e9be0f851ac53f4721"} Oct 03 08:10:27 crc kubenswrapper[4664]: I1003 08:10:27.681308 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4ca0-account-create-8x7r2" Oct 03 08:10:27 crc kubenswrapper[4664]: I1003 08:10:27.681326 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cc94325fbaf7847f8121c71ca6b0f1a198d1341e6e644e9be0f851ac53f4721" Oct 03 08:10:28 crc kubenswrapper[4664]: I1003 08:10:28.260542 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:28 crc kubenswrapper[4664]: I1003 08:10:28.691197 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696b55ca-18fe-412e-a3d9-08c9105679b5","Type":"ContainerStarted","Data":"097f1f1b13aec9d559338467d7769be79fa8ad097710bc1c90c778655b76a858"} Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.372350 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c6qmc"] Oct 03 08:10:29 crc kubenswrapper[4664]: E1003 08:10:29.373235 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a3e54d-79e2-48c5-84df-499f3ac97222" containerName="mariadb-account-create" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.373261 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a3e54d-79e2-48c5-84df-499f3ac97222" containerName="mariadb-account-create" Oct 03 08:10:29 crc kubenswrapper[4664]: E1003 08:10:29.373310 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17d3f5d3-82e5-4954-83a6-a7e0255df488" containerName="mariadb-account-create" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.373322 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d3f5d3-82e5-4954-83a6-a7e0255df488" containerName="mariadb-account-create" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.373560 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="53a3e54d-79e2-48c5-84df-499f3ac97222" containerName="mariadb-account-create" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.373709 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="17d3f5d3-82e5-4954-83a6-a7e0255df488" containerName="mariadb-account-create" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.374746 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.383342 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.384250 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.397559 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6x775" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.417550 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c6qmc"] Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.507390 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt8qd\" (UniqueName: \"kubernetes.io/projected/4958ebda-1932-42bd-825b-c64ac09c50ac-kube-api-access-bt8qd\") pod \"nova-cell0-conductor-db-sync-c6qmc\" (UID: \"4958ebda-1932-42bd-825b-c64ac09c50ac\") " pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.507465 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-config-data\") pod \"nova-cell0-conductor-db-sync-c6qmc\" (UID: \"4958ebda-1932-42bd-825b-c64ac09c50ac\") " pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.507656 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-scripts\") pod \"nova-cell0-conductor-db-sync-c6qmc\" (UID: \"4958ebda-1932-42bd-825b-c64ac09c50ac\") " pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.507935 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c6qmc\" (UID: \"4958ebda-1932-42bd-825b-c64ac09c50ac\") " pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.610167 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt8qd\" (UniqueName: \"kubernetes.io/projected/4958ebda-1932-42bd-825b-c64ac09c50ac-kube-api-access-bt8qd\") pod \"nova-cell0-conductor-db-sync-c6qmc\" (UID: \"4958ebda-1932-42bd-825b-c64ac09c50ac\") " pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.610217 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-config-data\") pod \"nova-cell0-conductor-db-sync-c6qmc\" (UID: \"4958ebda-1932-42bd-825b-c64ac09c50ac\") " pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.610253 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-scripts\") pod \"nova-cell0-conductor-db-sync-c6qmc\" (UID: 
\"4958ebda-1932-42bd-825b-c64ac09c50ac\") " pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.610291 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c6qmc\" (UID: \"4958ebda-1932-42bd-825b-c64ac09c50ac\") " pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.614520 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-scripts\") pod \"nova-cell0-conductor-db-sync-c6qmc\" (UID: \"4958ebda-1932-42bd-825b-c64ac09c50ac\") " pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.614764 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c6qmc\" (UID: \"4958ebda-1932-42bd-825b-c64ac09c50ac\") " pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.618495 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-config-data\") pod \"nova-cell0-conductor-db-sync-c6qmc\" (UID: \"4958ebda-1932-42bd-825b-c64ac09c50ac\") " pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.632443 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt8qd\" (UniqueName: \"kubernetes.io/projected/4958ebda-1932-42bd-825b-c64ac09c50ac-kube-api-access-bt8qd\") pod \"nova-cell0-conductor-db-sync-c6qmc\" (UID: \"4958ebda-1932-42bd-825b-c64ac09c50ac\") " pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.705386 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696b55ca-18fe-412e-a3d9-08c9105679b5","Type":"ContainerStarted","Data":"fff1da56d61a9d451416ba0945b4cc7f39659521b94424c60b9afdb59d99da7f"} Oct 03 08:10:29 crc kubenswrapper[4664]: I1003 08:10:29.726582 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:30 crc kubenswrapper[4664]: W1003 08:10:30.231867 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4958ebda_1932_42bd_825b_c64ac09c50ac.slice/crio-51f52781b9b7f0deae5803452563dad019242e40ec3041b48c4fa27a46ee2f32 WatchSource:0}: Error finding container 51f52781b9b7f0deae5803452563dad019242e40ec3041b48c4fa27a46ee2f32: Status 404 returned error can't find the container with id 51f52781b9b7f0deae5803452563dad019242e40ec3041b48c4fa27a46ee2f32 Oct 03 08:10:30 crc kubenswrapper[4664]: I1003 08:10:30.236553 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c6qmc"] Oct 03 08:10:30 crc kubenswrapper[4664]: I1003 08:10:30.720331 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696b55ca-18fe-412e-a3d9-08c9105679b5","Type":"ContainerStarted","Data":"db4bb8aa598d827d955b3bc9b6f7ebadf3f62212d9a12b841c07ecaa340b427b"} Oct 03 08:10:30 crc kubenswrapper[4664]: I1003 08:10:30.720695 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 08:10:30 crc kubenswrapper[4664]: I1003 08:10:30.720721 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="ceilometer-central-agent" containerID="cri-o://7f9b6652747d6f92d5a95cbae57ff217f7c78d4c55a1e2b1a27f5b2bf381d162" gracePeriod=30 Oct 03 08:10:30 crc kubenswrapper[4664]: I1003 08:10:30.720747 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="proxy-httpd" containerID="cri-o://db4bb8aa598d827d955b3bc9b6f7ebadf3f62212d9a12b841c07ecaa340b427b" gracePeriod=30 Oct 03 08:10:30 crc kubenswrapper[4664]: I1003 08:10:30.720750 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="ceilometer-notification-agent" containerID="cri-o://097f1f1b13aec9d559338467d7769be79fa8ad097710bc1c90c778655b76a858" gracePeriod=30 Oct 03 08:10:30 crc kubenswrapper[4664]: I1003 08:10:30.721018 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="sg-core" containerID="cri-o://fff1da56d61a9d451416ba0945b4cc7f39659521b94424c60b9afdb59d99da7f" gracePeriod=30 Oct 03 08:10:30 crc kubenswrapper[4664]: I1003 08:10:30.722527 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c6qmc" event={"ID":"4958ebda-1932-42bd-825b-c64ac09c50ac","Type":"ContainerStarted","Data":"51f52781b9b7f0deae5803452563dad019242e40ec3041b48c4fa27a46ee2f32"} Oct 03 08:10:30 crc kubenswrapper[4664]: I1003 08:10:30.749860 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.276523914 podStartE2EDuration="5.749842383s" podCreationTimestamp="2025-10-03 08:10:25 +0000 UTC" firstStartedPulling="2025-10-03 08:10:26.49797336 +0000 UTC m=+1327.319163850" lastFinishedPulling="2025-10-03 08:10:29.971291829 +0000 UTC m=+1330.792482319" observedRunningTime="2025-10-03 08:10:30.743489138 +0000 UTC m=+1331.564679658" watchObservedRunningTime="2025-10-03 08:10:30.749842383 +0000 
UTC m=+1331.571032873" Oct 03 08:10:31 crc kubenswrapper[4664]: I1003 08:10:31.737218 4664 generic.go:334] "Generic (PLEG): container finished" podID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerID="db4bb8aa598d827d955b3bc9b6f7ebadf3f62212d9a12b841c07ecaa340b427b" exitCode=0 Oct 03 08:10:31 crc kubenswrapper[4664]: I1003 08:10:31.737497 4664 generic.go:334] "Generic (PLEG): container finished" podID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerID="fff1da56d61a9d451416ba0945b4cc7f39659521b94424c60b9afdb59d99da7f" exitCode=2 Oct 03 08:10:31 crc kubenswrapper[4664]: I1003 08:10:31.737509 4664 generic.go:334] "Generic (PLEG): container finished" podID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerID="097f1f1b13aec9d559338467d7769be79fa8ad097710bc1c90c778655b76a858" exitCode=0 Oct 03 08:10:31 crc kubenswrapper[4664]: I1003 08:10:31.737307 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696b55ca-18fe-412e-a3d9-08c9105679b5","Type":"ContainerDied","Data":"db4bb8aa598d827d955b3bc9b6f7ebadf3f62212d9a12b841c07ecaa340b427b"} Oct 03 08:10:31 crc kubenswrapper[4664]: I1003 08:10:31.737561 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696b55ca-18fe-412e-a3d9-08c9105679b5","Type":"ContainerDied","Data":"fff1da56d61a9d451416ba0945b4cc7f39659521b94424c60b9afdb59d99da7f"} Oct 03 08:10:31 crc kubenswrapper[4664]: I1003 08:10:31.737586 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696b55ca-18fe-412e-a3d9-08c9105679b5","Type":"ContainerDied","Data":"097f1f1b13aec9d559338467d7769be79fa8ad097710bc1c90c778655b76a858"} Oct 03 08:10:37 crc kubenswrapper[4664]: I1003 08:10:37.804559 4664 generic.go:334] "Generic (PLEG): container finished" podID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerID="7f9b6652747d6f92d5a95cbae57ff217f7c78d4c55a1e2b1a27f5b2bf381d162" exitCode=0 Oct 03 08:10:37 crc kubenswrapper[4664]: I1003 08:10:37.804774 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696b55ca-18fe-412e-a3d9-08c9105679b5","Type":"ContainerDied","Data":"7f9b6652747d6f92d5a95cbae57ff217f7c78d4c55a1e2b1a27f5b2bf381d162"} Oct 03 08:10:37 crc kubenswrapper[4664]: I1003 08:10:37.810752 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c6qmc" event={"ID":"4958ebda-1932-42bd-825b-c64ac09c50ac","Type":"ContainerStarted","Data":"e199afdf389fdeda61244d3d59d662461641ef71f62fd5a6d8cb8c80371c855e"} Oct 03 08:10:37 crc kubenswrapper[4664]: I1003 08:10:37.838840 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-c6qmc" podStartSLOduration=1.6680129209999999 podStartE2EDuration="8.838815383s" podCreationTimestamp="2025-10-03 08:10:29 +0000 UTC" firstStartedPulling="2025-10-03 08:10:30.237505635 +0000 UTC m=+1331.058696125" lastFinishedPulling="2025-10-03 08:10:37.408308087 +0000 UTC m=+1338.229498587" observedRunningTime="2025-10-03 08:10:37.833019964 +0000 UTC m=+1338.654210474" watchObservedRunningTime="2025-10-03 08:10:37.838815383 +0000 UTC m=+1338.660005873" Oct 03 08:10:37 crc kubenswrapper[4664]: I1003 08:10:37.962966 4664 util.go:48] "No ready sandbox for pod can be found. 
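The two "Observed pod startup duration" entries above make the tracker's arithmetic visible: podStartSLOduration is the end-to-end startup time minus the window spent pulling images (lastFinishedPulling - firstStartedPulling). Re-deriving the ceilometer-0 numbers in Go, using the monotonic m=+ offsets from the entry (the subtraction is the only assumption here):

package main

import "fmt"

func main() {
	// Monotonic m=+ offsets copied from the ceilometer-0 entry above.
	firstStartedPulling := 1327.319163850
	lastFinishedPulling := 1330.792482319
	podStartE2E := 5.749842383 // podStartE2EDuration, in seconds

	pulling := lastFinishedPulling - firstStartedPulling // 3.473318469s spent pulling
	fmt.Printf("podStartSLOduration = %.9fs\n", podStartE2E-pulling)
	// Prints 2.276523914s, matching the logged podStartSLOduration=2.276523914.
}

The nova-cell0-conductor-db-sync-c6qmc entry obeys the same identity: 8.838815383s minus its 7.170802462s pull window gives the logged podStartSLOduration=1.6680129209999999, float artifacts included.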
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.001827 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-scripts\") pod \"696b55ca-18fe-412e-a3d9-08c9105679b5\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.001951 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-sg-core-conf-yaml\") pod \"696b55ca-18fe-412e-a3d9-08c9105679b5\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.002004 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696b55ca-18fe-412e-a3d9-08c9105679b5-run-httpd\") pod \"696b55ca-18fe-412e-a3d9-08c9105679b5\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.002064 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27s9p\" (UniqueName: \"kubernetes.io/projected/696b55ca-18fe-412e-a3d9-08c9105679b5-kube-api-access-27s9p\") pod \"696b55ca-18fe-412e-a3d9-08c9105679b5\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.002114 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696b55ca-18fe-412e-a3d9-08c9105679b5-log-httpd\") pod \"696b55ca-18fe-412e-a3d9-08c9105679b5\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.002142 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-config-data\") pod \"696b55ca-18fe-412e-a3d9-08c9105679b5\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.002178 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-combined-ca-bundle\") pod \"696b55ca-18fe-412e-a3d9-08c9105679b5\" (UID: \"696b55ca-18fe-412e-a3d9-08c9105679b5\") " Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.003305 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/696b55ca-18fe-412e-a3d9-08c9105679b5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "696b55ca-18fe-412e-a3d9-08c9105679b5" (UID: "696b55ca-18fe-412e-a3d9-08c9105679b5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.013574 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/696b55ca-18fe-412e-a3d9-08c9105679b5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "696b55ca-18fe-412e-a3d9-08c9105679b5" (UID: "696b55ca-18fe-412e-a3d9-08c9105679b5"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.043194 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-scripts" (OuterVolumeSpecName: "scripts") pod "696b55ca-18fe-412e-a3d9-08c9105679b5" (UID: "696b55ca-18fe-412e-a3d9-08c9105679b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.045063 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/696b55ca-18fe-412e-a3d9-08c9105679b5-kube-api-access-27s9p" (OuterVolumeSpecName: "kube-api-access-27s9p") pod "696b55ca-18fe-412e-a3d9-08c9105679b5" (UID: "696b55ca-18fe-412e-a3d9-08c9105679b5"). InnerVolumeSpecName "kube-api-access-27s9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.049939 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "696b55ca-18fe-412e-a3d9-08c9105679b5" (UID: "696b55ca-18fe-412e-a3d9-08c9105679b5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.105268 4664 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.105738 4664 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696b55ca-18fe-412e-a3d9-08c9105679b5-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.105779 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27s9p\" (UniqueName: \"kubernetes.io/projected/696b55ca-18fe-412e-a3d9-08c9105679b5-kube-api-access-27s9p\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.105796 4664 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696b55ca-18fe-412e-a3d9-08c9105679b5-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.105807 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.106979 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "696b55ca-18fe-412e-a3d9-08c9105679b5" (UID: "696b55ca-18fe-412e-a3d9-08c9105679b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.134065 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-config-data" (OuterVolumeSpecName: "config-data") pod "696b55ca-18fe-412e-a3d9-08c9105679b5" (UID: "696b55ca-18fe-412e-a3d9-08c9105679b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.208484 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.209017 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696b55ca-18fe-412e-a3d9-08c9105679b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.822932 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696b55ca-18fe-412e-a3d9-08c9105679b5","Type":"ContainerDied","Data":"996fbce09d75f070649475cc59053687dc07073bf3abecff4d0c7541fb50de69"} Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.823693 4664 scope.go:117] "RemoveContainer" containerID="db4bb8aa598d827d955b3bc9b6f7ebadf3f62212d9a12b841c07ecaa340b427b" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.823000 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.856944 4664 scope.go:117] "RemoveContainer" containerID="fff1da56d61a9d451416ba0945b4cc7f39659521b94424c60b9afdb59d99da7f" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.881807 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.901873 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.914689 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:38 crc kubenswrapper[4664]: E1003 08:10:38.915402 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="sg-core" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.915436 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="sg-core" Oct 03 08:10:38 crc kubenswrapper[4664]: E1003 08:10:38.915457 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="ceilometer-central-agent" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.915465 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="ceilometer-central-agent" Oct 03 08:10:38 crc kubenswrapper[4664]: E1003 08:10:38.915479 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="ceilometer-notification-agent" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.915486 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="ceilometer-notification-agent" Oct 03 08:10:38 crc kubenswrapper[4664]: E1003 08:10:38.915536 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="proxy-httpd" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.915544 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="proxy-httpd" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.915401 4664 scope.go:117] 
"RemoveContainer" containerID="097f1f1b13aec9d559338467d7769be79fa8ad097710bc1c90c778655b76a858" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.915792 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="proxy-httpd" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.915826 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="ceilometer-central-agent" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.915835 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="sg-core" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.915844 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" containerName="ceilometer-notification-agent" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.918804 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.925365 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.925511 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.927858 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:38 crc kubenswrapper[4664]: I1003 08:10:38.964370 4664 scope.go:117] "RemoveContainer" containerID="7f9b6652747d6f92d5a95cbae57ff217f7c78d4c55a1e2b1a27f5b2bf381d162" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.136494 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-scripts\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.136560 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.136853 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-config-data\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.137095 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfmwc\" (UniqueName: \"kubernetes.io/projected/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-kube-api-access-nfmwc\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.137163 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-run-httpd\") pod \"ceilometer-0\" (UID: 
\"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.137407 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-log-httpd\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.137465 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.239332 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-log-httpd\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.239385 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.239440 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-scripts\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.239463 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.239536 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-config-data\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.239804 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfmwc\" (UniqueName: \"kubernetes.io/projected/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-kube-api-access-nfmwc\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.239836 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-run-httpd\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.240253 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-log-httpd\") pod \"ceilometer-0\" 
(UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.241068 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-run-httpd\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.255269 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.255539 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.255730 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-scripts\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.256222 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-config-data\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.260083 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfmwc\" (UniqueName: \"kubernetes.io/projected/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-kube-api-access-nfmwc\") pod \"ceilometer-0\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") " pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.550383 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:10:39 crc kubenswrapper[4664]: I1003 08:10:39.895804 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="696b55ca-18fe-412e-a3d9-08c9105679b5" path="/var/lib/kubelet/pods/696b55ca-18fe-412e-a3d9-08c9105679b5/volumes" Oct 03 08:10:40 crc kubenswrapper[4664]: I1003 08:10:40.042525 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:10:40 crc kubenswrapper[4664]: I1003 08:10:40.847219 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c98f4aa-c34f-4ccb-9198-d2daf975e5df","Type":"ContainerStarted","Data":"1d8c80d4d32f8cefbc4f9606e7336a30d4086a19871787c47b81454294e67df6"} Oct 03 08:10:41 crc kubenswrapper[4664]: I1003 08:10:41.864764 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c98f4aa-c34f-4ccb-9198-d2daf975e5df","Type":"ContainerStarted","Data":"ecb74b1639508af304c6b86b47042ebedf8fada82a3eee7bfad3d2c1c67c78c1"} Oct 03 08:10:42 crc kubenswrapper[4664]: I1003 08:10:42.874844 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c98f4aa-c34f-4ccb-9198-d2daf975e5df","Type":"ContainerStarted","Data":"b49ad7c2b6db5d70aaef83dc2b4b5b0b0ed1c6dcce10190324332842a156941d"} Oct 03 08:10:43 crc kubenswrapper[4664]: I1003 08:10:43.897957 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c98f4aa-c34f-4ccb-9198-d2daf975e5df","Type":"ContainerStarted","Data":"62613db899840e1c90b3db2b2802f4bd1d6263fcc10f9206537360bf95541040"} Oct 03 08:10:44 crc kubenswrapper[4664]: I1003 08:10:44.908973 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c98f4aa-c34f-4ccb-9198-d2daf975e5df","Type":"ContainerStarted","Data":"e1d7af77eda09d0f41ed8a469b12a776492acbb0035601478093de2735b67b6a"} Oct 03 08:10:44 crc kubenswrapper[4664]: I1003 08:10:44.909696 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 08:10:44 crc kubenswrapper[4664]: I1003 08:10:44.933938 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.487024195 podStartE2EDuration="6.933914792s" podCreationTimestamp="2025-10-03 08:10:38 +0000 UTC" firstStartedPulling="2025-10-03 08:10:40.057289231 +0000 UTC m=+1340.878479721" lastFinishedPulling="2025-10-03 08:10:44.504179828 +0000 UTC m=+1345.325370318" observedRunningTime="2025-10-03 08:10:44.927010081 +0000 UTC m=+1345.748200571" watchObservedRunningTime="2025-10-03 08:10:44.933914792 +0000 UTC m=+1345.755105282" Oct 03 08:10:49 crc kubenswrapper[4664]: I1003 08:10:49.959925 4664 generic.go:334] "Generic (PLEG): container finished" podID="4958ebda-1932-42bd-825b-c64ac09c50ac" containerID="e199afdf389fdeda61244d3d59d662461641ef71f62fd5a6d8cb8c80371c855e" exitCode=0 Oct 03 08:10:49 crc kubenswrapper[4664]: I1003 08:10:49.960021 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c6qmc" event={"ID":"4958ebda-1932-42bd-825b-c64ac09c50ac","Type":"ContainerDied","Data":"e199afdf389fdeda61244d3d59d662461641ef71f62fd5a6d8cb8c80371c855e"} Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.396807 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.590343 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt8qd\" (UniqueName: \"kubernetes.io/projected/4958ebda-1932-42bd-825b-c64ac09c50ac-kube-api-access-bt8qd\") pod \"4958ebda-1932-42bd-825b-c64ac09c50ac\" (UID: \"4958ebda-1932-42bd-825b-c64ac09c50ac\") " Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.590968 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-config-data\") pod \"4958ebda-1932-42bd-825b-c64ac09c50ac\" (UID: \"4958ebda-1932-42bd-825b-c64ac09c50ac\") " Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.591149 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-combined-ca-bundle\") pod \"4958ebda-1932-42bd-825b-c64ac09c50ac\" (UID: \"4958ebda-1932-42bd-825b-c64ac09c50ac\") " Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.591190 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-scripts\") pod \"4958ebda-1932-42bd-825b-c64ac09c50ac\" (UID: \"4958ebda-1932-42bd-825b-c64ac09c50ac\") " Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.600269 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-scripts" (OuterVolumeSpecName: "scripts") pod "4958ebda-1932-42bd-825b-c64ac09c50ac" (UID: "4958ebda-1932-42bd-825b-c64ac09c50ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.600446 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4958ebda-1932-42bd-825b-c64ac09c50ac-kube-api-access-bt8qd" (OuterVolumeSpecName: "kube-api-access-bt8qd") pod "4958ebda-1932-42bd-825b-c64ac09c50ac" (UID: "4958ebda-1932-42bd-825b-c64ac09c50ac"). InnerVolumeSpecName "kube-api-access-bt8qd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.626867 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4958ebda-1932-42bd-825b-c64ac09c50ac" (UID: "4958ebda-1932-42bd-825b-c64ac09c50ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.629198 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-config-data" (OuterVolumeSpecName: "config-data") pod "4958ebda-1932-42bd-825b-c64ac09c50ac" (UID: "4958ebda-1932-42bd-825b-c64ac09c50ac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.695016 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt8qd\" (UniqueName: \"kubernetes.io/projected/4958ebda-1932-42bd-825b-c64ac09c50ac-kube-api-access-bt8qd\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.695070 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.695086 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.695098 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4958ebda-1932-42bd-825b-c64ac09c50ac-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.990478 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c6qmc" event={"ID":"4958ebda-1932-42bd-825b-c64ac09c50ac","Type":"ContainerDied","Data":"51f52781b9b7f0deae5803452563dad019242e40ec3041b48c4fa27a46ee2f32"} Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.990530 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51f52781b9b7f0deae5803452563dad019242e40ec3041b48c4fa27a46ee2f32" Oct 03 08:10:51 crc kubenswrapper[4664]: I1003 08:10:51.990632 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c6qmc" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.097017 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 08:10:52 crc kubenswrapper[4664]: E1003 08:10:52.098098 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4958ebda-1932-42bd-825b-c64ac09c50ac" containerName="nova-cell0-conductor-db-sync" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.098129 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="4958ebda-1932-42bd-825b-c64ac09c50ac" containerName="nova-cell0-conductor-db-sync" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.098393 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="4958ebda-1932-42bd-825b-c64ac09c50ac" containerName="nova-cell0-conductor-db-sync" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.099439 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.103304 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.103709 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6x775" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.112526 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.210834 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/055183e7-9fec-4cb4-858b-3a9f7fabfdcf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"055183e7-9fec-4cb4-858b-3a9f7fabfdcf\") " pod="openstack/nova-cell0-conductor-0" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.210931 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxmwh\" (UniqueName: \"kubernetes.io/projected/055183e7-9fec-4cb4-858b-3a9f7fabfdcf-kube-api-access-mxmwh\") pod \"nova-cell0-conductor-0\" (UID: \"055183e7-9fec-4cb4-858b-3a9f7fabfdcf\") " pod="openstack/nova-cell0-conductor-0" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.211727 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/055183e7-9fec-4cb4-858b-3a9f7fabfdcf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"055183e7-9fec-4cb4-858b-3a9f7fabfdcf\") " pod="openstack/nova-cell0-conductor-0" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.313381 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/055183e7-9fec-4cb4-858b-3a9f7fabfdcf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"055183e7-9fec-4cb4-858b-3a9f7fabfdcf\") " pod="openstack/nova-cell0-conductor-0" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.313477 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxmwh\" (UniqueName: \"kubernetes.io/projected/055183e7-9fec-4cb4-858b-3a9f7fabfdcf-kube-api-access-mxmwh\") pod \"nova-cell0-conductor-0\" (UID: \"055183e7-9fec-4cb4-858b-3a9f7fabfdcf\") " pod="openstack/nova-cell0-conductor-0" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.313584 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/055183e7-9fec-4cb4-858b-3a9f7fabfdcf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"055183e7-9fec-4cb4-858b-3a9f7fabfdcf\") " pod="openstack/nova-cell0-conductor-0" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.320683 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/055183e7-9fec-4cb4-858b-3a9f7fabfdcf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"055183e7-9fec-4cb4-858b-3a9f7fabfdcf\") " pod="openstack/nova-cell0-conductor-0" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.322475 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/055183e7-9fec-4cb4-858b-3a9f7fabfdcf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"055183e7-9fec-4cb4-858b-3a9f7fabfdcf\") " pod="openstack/nova-cell0-conductor-0" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.334596 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxmwh\" (UniqueName: \"kubernetes.io/projected/055183e7-9fec-4cb4-858b-3a9f7fabfdcf-kube-api-access-mxmwh\") pod \"nova-cell0-conductor-0\" (UID: \"055183e7-9fec-4cb4-858b-3a9f7fabfdcf\") " pod="openstack/nova-cell0-conductor-0" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.420518 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 08:10:52 crc kubenswrapper[4664]: I1003 08:10:52.877167 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 08:10:53 crc kubenswrapper[4664]: I1003 08:10:53.002973 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"055183e7-9fec-4cb4-858b-3a9f7fabfdcf","Type":"ContainerStarted","Data":"c769833c4d6af6ad6d1c20a1de4ea4552e9b57448e7bf287ea23756249c85334"} Oct 03 08:10:54 crc kubenswrapper[4664]: I1003 08:10:54.016037 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"055183e7-9fec-4cb4-858b-3a9f7fabfdcf","Type":"ContainerStarted","Data":"0cde53b28cc4f7ff81d44fa433f878c62d1d53a5bd0316ebf9fe124a67b19b4d"} Oct 03 08:10:54 crc kubenswrapper[4664]: I1003 08:10:54.017268 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 03 08:10:54 crc kubenswrapper[4664]: I1003 08:10:54.040101 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.040043152 podStartE2EDuration="2.040043152s" podCreationTimestamp="2025-10-03 08:10:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:10:54.036939142 +0000 UTC m=+1354.858129642" watchObservedRunningTime="2025-10-03 08:10:54.040043152 +0000 UTC m=+1354.861233642" Oct 03 08:11:02 crc kubenswrapper[4664]: I1003 08:11:02.450454 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 03 08:11:02 crc kubenswrapper[4664]: I1003 08:11:02.928649 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-srldj"] Oct 03 08:11:02 crc kubenswrapper[4664]: I1003 08:11:02.931258 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:02 crc kubenswrapper[4664]: I1003 08:11:02.933496 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 03 08:11:02 crc kubenswrapper[4664]: I1003 08:11:02.933516 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 03 08:11:02 crc kubenswrapper[4664]: I1003 08:11:02.940545 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-srldj"] Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.028060 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lxl5\" (UniqueName: \"kubernetes.io/projected/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-kube-api-access-7lxl5\") pod \"nova-cell0-cell-mapping-srldj\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.028330 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-scripts\") pod \"nova-cell0-cell-mapping-srldj\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.028486 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-config-data\") pod \"nova-cell0-cell-mapping-srldj\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.028566 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-srldj\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.124562 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.126323 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.130793 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lxl5\" (UniqueName: \"kubernetes.io/projected/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-kube-api-access-7lxl5\") pod \"nova-cell0-cell-mapping-srldj\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.130952 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-scripts\") pod \"nova-cell0-cell-mapping-srldj\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.130996 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-config-data\") pod \"nova-cell0-cell-mapping-srldj\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.131036 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-srldj\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.139091 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.143484 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-srldj\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.145772 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-scripts\") pod \"nova-cell0-cell-mapping-srldj\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.158840 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.164394 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-config-data\") pod \"nova-cell0-cell-mapping-srldj\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.168446 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lxl5\" (UniqueName: \"kubernetes.io/projected/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-kube-api-access-7lxl5\") pod \"nova-cell0-cell-mapping-srldj\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.224325 4664 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.225955 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.234918 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.235767 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") " pod="openstack/nova-api-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.235937 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8557bd77-628d-4cbd-8570-05897fa8495e-config-data\") pod \"nova-scheduler-0\" (UID: \"8557bd77-628d-4cbd-8570-05897fa8495e\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.236080 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8557bd77-628d-4cbd-8570-05897fa8495e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8557bd77-628d-4cbd-8570-05897fa8495e\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.236249 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p49g\" (UniqueName: \"kubernetes.io/projected/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-kube-api-access-9p49g\") pod \"nova-api-0\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") " pod="openstack/nova-api-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.236365 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-config-data\") pod \"nova-api-0\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") " pod="openstack/nova-api-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.236405 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-logs\") pod \"nova-api-0\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") " pod="openstack/nova-api-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.236435 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2p82\" (UniqueName: \"kubernetes.io/projected/8557bd77-628d-4cbd-8570-05897fa8495e-kube-api-access-l2p82\") pod \"nova-scheduler-0\" (UID: \"8557bd77-628d-4cbd-8570-05897fa8495e\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.263763 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.268793 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.284975 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.293730 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.294032 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.327156 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.339500 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p49g\" (UniqueName: \"kubernetes.io/projected/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-kube-api-access-9p49g\") pod \"nova-api-0\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") " pod="openstack/nova-api-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.339568 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-config-data\") pod \"nova-api-0\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") " pod="openstack/nova-api-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.339612 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-logs\") pod \"nova-api-0\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") " pod="openstack/nova-api-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.339639 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2p82\" (UniqueName: \"kubernetes.io/projected/8557bd77-628d-4cbd-8570-05897fa8495e-kube-api-access-l2p82\") pod \"nova-scheduler-0\" (UID: \"8557bd77-628d-4cbd-8570-05897fa8495e\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.339715 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") " pod="openstack/nova-api-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.339766 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8557bd77-628d-4cbd-8570-05897fa8495e-config-data\") pod \"nova-scheduler-0\" (UID: \"8557bd77-628d-4cbd-8570-05897fa8495e\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.339959 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8557bd77-628d-4cbd-8570-05897fa8495e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8557bd77-628d-4cbd-8570-05897fa8495e\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.349839 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-logs\") pod \"nova-api-0\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") " pod="openstack/nova-api-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.355309 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8557bd77-628d-4cbd-8570-05897fa8495e-config-data\") pod \"nova-scheduler-0\" (UID: \"8557bd77-628d-4cbd-8570-05897fa8495e\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.367362 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8557bd77-628d-4cbd-8570-05897fa8495e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8557bd77-628d-4cbd-8570-05897fa8495e\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.395384 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-config-data\") pod \"nova-api-0\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") " pod="openstack/nova-api-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.405431 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2p82\" (UniqueName: \"kubernetes.io/projected/8557bd77-628d-4cbd-8570-05897fa8495e-kube-api-access-l2p82\") pod \"nova-scheduler-0\" (UID: \"8557bd77-628d-4cbd-8570-05897fa8495e\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.405565 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") " pod="openstack/nova-api-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.414943 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p49g\" (UniqueName: \"kubernetes.io/projected/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-kube-api-access-9p49g\") pod \"nova-api-0\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") " pod="openstack/nova-api-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.436683 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-rg57h"] Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.438859 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.455807 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca8d00b-c895-4c55-afd7-add414c51d9e-config-data\") pod \"nova-metadata-0\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") " pod="openstack/nova-metadata-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.455971 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca8d00b-c895-4c55-afd7-add414c51d9e-logs\") pod \"nova-metadata-0\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") " pod="openstack/nova-metadata-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.456040 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chx6s\" (UniqueName: \"kubernetes.io/projected/8ca8d00b-c895-4c55-afd7-add414c51d9e-kube-api-access-chx6s\") pod \"nova-metadata-0\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") " pod="openstack/nova-metadata-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.456082 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca8d00b-c895-4c55-afd7-add414c51d9e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") " pod="openstack/nova-metadata-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.472690 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-rg57h"] Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.494678 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.505369 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.506338 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.511946 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.558092 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca8d00b-c895-4c55-afd7-add414c51d9e-logs\") pod \"nova-metadata-0\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") " pod="openstack/nova-metadata-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.558473 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chx6s\" (UniqueName: \"kubernetes.io/projected/8ca8d00b-c895-4c55-afd7-add414c51d9e-kube-api-access-chx6s\") pod \"nova-metadata-0\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") " pod="openstack/nova-metadata-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.559218 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.559255 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca8d00b-c895-4c55-afd7-add414c51d9e-logs\") pod \"nova-metadata-0\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") " pod="openstack/nova-metadata-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.559284 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca8d00b-c895-4c55-afd7-add414c51d9e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") " pod="openstack/nova-metadata-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.559352 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-dns-svc\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.559432 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.559517 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-config\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.559564 4664 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhkgw\" (UniqueName: \"kubernetes.io/projected/2ea47450-c8b3-4b8f-85f0-53a44121a988-kube-api-access-bhkgw\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.559596 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca8d00b-c895-4c55-afd7-add414c51d9e-config-data\") pod \"nova-metadata-0\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") " pod="openstack/nova-metadata-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.559676 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.571185 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.577387 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca8d00b-c895-4c55-afd7-add414c51d9e-config-data\") pod \"nova-metadata-0\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") " pod="openstack/nova-metadata-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.588719 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca8d00b-c895-4c55-afd7-add414c51d9e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") " pod="openstack/nova-metadata-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.600622 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chx6s\" (UniqueName: \"kubernetes.io/projected/8ca8d00b-c895-4c55-afd7-add414c51d9e-kube-api-access-chx6s\") pod \"nova-metadata-0\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") " pod="openstack/nova-metadata-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.605953 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.662176 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-dns-svc\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.662378 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cd69772-a7a7-4126-9302-e7aae399ee11-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cd69772-a7a7-4126-9302-e7aae399ee11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.662407 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.662445 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-config\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.662471 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhkgw\" (UniqueName: \"kubernetes.io/projected/2ea47450-c8b3-4b8f-85f0-53a44121a988-kube-api-access-bhkgw\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.662501 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.662537 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd69772-a7a7-4126-9302-e7aae399ee11-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cd69772-a7a7-4126-9302-e7aae399ee11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.662577 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t74pm\" (UniqueName: \"kubernetes.io/projected/0cd69772-a7a7-4126-9302-e7aae399ee11-kube-api-access-t74pm\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cd69772-a7a7-4126-9302-e7aae399ee11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.662622 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " 
pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.664296 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.664952 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-dns-svc\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.665514 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.666047 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-config\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.668815 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.689991 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhkgw\" (UniqueName: \"kubernetes.io/projected/2ea47450-c8b3-4b8f-85f0-53a44121a988-kube-api-access-bhkgw\") pod \"dnsmasq-dns-865f5d856f-rg57h\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.765638 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd69772-a7a7-4126-9302-e7aae399ee11-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cd69772-a7a7-4126-9302-e7aae399ee11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.765729 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t74pm\" (UniqueName: \"kubernetes.io/projected/0cd69772-a7a7-4126-9302-e7aae399ee11-kube-api-access-t74pm\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cd69772-a7a7-4126-9302-e7aae399ee11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.765818 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cd69772-a7a7-4126-9302-e7aae399ee11-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cd69772-a7a7-4126-9302-e7aae399ee11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.771670 4664 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd69772-a7a7-4126-9302-e7aae399ee11-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cd69772-a7a7-4126-9302-e7aae399ee11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.773515 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cd69772-a7a7-4126-9302-e7aae399ee11-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cd69772-a7a7-4126-9302-e7aae399ee11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.793411 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.805088 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t74pm\" (UniqueName: \"kubernetes.io/projected/0cd69772-a7a7-4126-9302-e7aae399ee11-kube-api-access-t74pm\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cd69772-a7a7-4126-9302-e7aae399ee11\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.809974 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:03 crc kubenswrapper[4664]: I1003 08:11:03.862914 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.015640 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-srldj"] Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.138711 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-srldj" event={"ID":"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef","Type":"ContainerStarted","Data":"f6031a24aa714ebed026f2bb0d33e6579dc080919b71eb45fd4be22e19d59a2f"} Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.223932 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.237928 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.362468 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7c992"] Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.364660 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.367973 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.369201 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.384271 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7c992"] Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.387866 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6rq5\" (UniqueName: \"kubernetes.io/projected/bb30837f-4a77-4972-a012-ef0c51b62cb5-kube-api-access-m6rq5\") pod \"nova-cell1-conductor-db-sync-7c992\" (UID: \"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.387910 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-scripts\") pod \"nova-cell1-conductor-db-sync-7c992\" (UID: \"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.388040 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7c992\" (UID: \"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.388064 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-config-data\") pod \"nova-cell1-conductor-db-sync-7c992\" (UID: \"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.465268 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.489905 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7c992\" (UID: \"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.490164 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-config-data\") pod \"nova-cell1-conductor-db-sync-7c992\" (UID: \"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.490297 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6rq5\" (UniqueName: \"kubernetes.io/projected/bb30837f-4a77-4972-a012-ef0c51b62cb5-kube-api-access-m6rq5\") pod \"nova-cell1-conductor-db-sync-7c992\" (UID: 
\"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.490379 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-scripts\") pod \"nova-cell1-conductor-db-sync-7c992\" (UID: \"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.497594 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-config-data\") pod \"nova-cell1-conductor-db-sync-7c992\" (UID: \"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.497591 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7c992\" (UID: \"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.512216 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-scripts\") pod \"nova-cell1-conductor-db-sync-7c992\" (UID: \"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.513177 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-rg57h"] Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.516151 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6rq5\" (UniqueName: \"kubernetes.io/projected/bb30837f-4a77-4972-a012-ef0c51b62cb5-kube-api-access-m6rq5\") pod \"nova-cell1-conductor-db-sync-7c992\" (UID: \"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:04 crc kubenswrapper[4664]: W1003 08:11:04.517318 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ea47450_c8b3_4b8f_85f0_53a44121a988.slice/crio-78d73dfe6db2b5b05f368f002c6033be6e2fffa2e003f5c5fe22633ebe1e2c5b WatchSource:0}: Error finding container 78d73dfe6db2b5b05f368f002c6033be6e2fffa2e003f5c5fe22633ebe1e2c5b: Status 404 returned error can't find the container with id 78d73dfe6db2b5b05f368f002c6033be6e2fffa2e003f5c5fe22633ebe1e2c5b Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.698297 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:04 crc kubenswrapper[4664]: I1003 08:11:04.712529 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 08:11:05 crc kubenswrapper[4664]: I1003 08:11:05.150898 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ca8d00b-c895-4c55-afd7-add414c51d9e","Type":"ContainerStarted","Data":"6f85f1e8ec87d01d19181d6d6310bfbfbed288e85fdd0ab5ed0858fcf01d74c1"} Oct 03 08:11:05 crc kubenswrapper[4664]: I1003 08:11:05.154556 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-srldj" event={"ID":"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef","Type":"ContainerStarted","Data":"b0b1c809d633c9aa80a2ccb69d3579a6dc65f481cf89091f33d43f760ee397bc"} Oct 03 08:11:05 crc kubenswrapper[4664]: I1003 08:11:05.156866 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a","Type":"ContainerStarted","Data":"2146ee7a295ba4d67b57644b0689b281cba7deec152e6b7a62f0ee6dac91f3b3"} Oct 03 08:11:05 crc kubenswrapper[4664]: I1003 08:11:05.159360 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0cd69772-a7a7-4126-9302-e7aae399ee11","Type":"ContainerStarted","Data":"e72ba4aeff440834fff4aacd47ec0e91f4be50f2334658e944084a239231443b"} Oct 03 08:11:05 crc kubenswrapper[4664]: I1003 08:11:05.161493 4664 generic.go:334] "Generic (PLEG): container finished" podID="2ea47450-c8b3-4b8f-85f0-53a44121a988" containerID="c3225e3260eea2df07e48298267610627d37cfda8da937b93e1b975ee59bb843" exitCode=0 Oct 03 08:11:05 crc kubenswrapper[4664]: I1003 08:11:05.161661 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-rg57h" event={"ID":"2ea47450-c8b3-4b8f-85f0-53a44121a988","Type":"ContainerDied","Data":"c3225e3260eea2df07e48298267610627d37cfda8da937b93e1b975ee59bb843"} Oct 03 08:11:05 crc kubenswrapper[4664]: I1003 08:11:05.161693 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-rg57h" event={"ID":"2ea47450-c8b3-4b8f-85f0-53a44121a988","Type":"ContainerStarted","Data":"78d73dfe6db2b5b05f368f002c6033be6e2fffa2e003f5c5fe22633ebe1e2c5b"} Oct 03 08:11:05 crc kubenswrapper[4664]: I1003 08:11:05.163318 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8557bd77-628d-4cbd-8570-05897fa8495e","Type":"ContainerStarted","Data":"5567f02afcbdb6769855d30e14868796e2ea93463306e6814a5fd08a616d63f0"} Oct 03 08:11:05 crc kubenswrapper[4664]: I1003 08:11:05.193046 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-srldj" podStartSLOduration=3.193024651 podStartE2EDuration="3.193024651s" podCreationTimestamp="2025-10-03 08:11:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:11:05.172346797 +0000 UTC m=+1365.993537297" watchObservedRunningTime="2025-10-03 08:11:05.193024651 +0000 UTC m=+1366.014215141" Oct 03 08:11:05 crc kubenswrapper[4664]: W1003 08:11:05.235930 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb30837f_4a77_4972_a012_ef0c51b62cb5.slice/crio-acd991a66150336a98f95cd11f4462e106fa5e7278e4071c69e9ce23dd9e1074 WatchSource:0}: Error finding 
container acd991a66150336a98f95cd11f4462e106fa5e7278e4071c69e9ce23dd9e1074: Status 404 returned error can't find the container with id acd991a66150336a98f95cd11f4462e106fa5e7278e4071c69e9ce23dd9e1074 Oct 03 08:11:05 crc kubenswrapper[4664]: I1003 08:11:05.236378 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7c992"] Oct 03 08:11:06 crc kubenswrapper[4664]: I1003 08:11:06.192405 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7c992" event={"ID":"bb30837f-4a77-4972-a012-ef0c51b62cb5","Type":"ContainerStarted","Data":"6559c0f3b65e30b8ddff55a52fd232a7626191dae7979b6ec4aabe70abbb11e6"} Oct 03 08:11:06 crc kubenswrapper[4664]: I1003 08:11:06.192974 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7c992" event={"ID":"bb30837f-4a77-4972-a012-ef0c51b62cb5","Type":"ContainerStarted","Data":"acd991a66150336a98f95cd11f4462e106fa5e7278e4071c69e9ce23dd9e1074"} Oct 03 08:11:06 crc kubenswrapper[4664]: I1003 08:11:06.199821 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-rg57h" event={"ID":"2ea47450-c8b3-4b8f-85f0-53a44121a988","Type":"ContainerStarted","Data":"c5862623667e86b9f2bdfef90ee1d72f4f75ee5592aecd00629913a74dbaff85"} Oct 03 08:11:06 crc kubenswrapper[4664]: I1003 08:11:06.199865 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:06 crc kubenswrapper[4664]: I1003 08:11:06.223000 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7c992" podStartSLOduration=2.2229806500000002 podStartE2EDuration="2.22298065s" podCreationTimestamp="2025-10-03 08:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:11:06.208369394 +0000 UTC m=+1367.029559904" watchObservedRunningTime="2025-10-03 08:11:06.22298065 +0000 UTC m=+1367.044171130" Oct 03 08:11:06 crc kubenswrapper[4664]: I1003 08:11:06.230979 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-rg57h" podStartSLOduration=3.230961744 podStartE2EDuration="3.230961744s" podCreationTimestamp="2025-10-03 08:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:11:06.227658637 +0000 UTC m=+1367.048849137" watchObservedRunningTime="2025-10-03 08:11:06.230961744 +0000 UTC m=+1367.052152224" Oct 03 08:11:06 crc kubenswrapper[4664]: I1003 08:11:06.864020 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 08:11:06 crc kubenswrapper[4664]: I1003 08:11:06.877405 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.220599 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0cd69772-a7a7-4126-9302-e7aae399ee11","Type":"ContainerStarted","Data":"52634ef9946d611a3caff50a56e86b1ac0e9a595f7a33923319fa611ae831287"} Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.220952 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0cd69772-a7a7-4126-9302-e7aae399ee11" containerName="nova-cell1-novncproxy-novncproxy" 
containerID="cri-o://52634ef9946d611a3caff50a56e86b1ac0e9a595f7a33923319fa611ae831287" gracePeriod=30 Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.231890 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8557bd77-628d-4cbd-8570-05897fa8495e","Type":"ContainerStarted","Data":"c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24"} Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.239553 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.173082938 podStartE2EDuration="5.239533342s" podCreationTimestamp="2025-10-03 08:11:03 +0000 UTC" firstStartedPulling="2025-10-03 08:11:04.472012974 +0000 UTC m=+1365.293203464" lastFinishedPulling="2025-10-03 08:11:07.538463378 +0000 UTC m=+1368.359653868" observedRunningTime="2025-10-03 08:11:08.239397348 +0000 UTC m=+1369.060587858" watchObservedRunningTime="2025-10-03 08:11:08.239533342 +0000 UTC m=+1369.060723842" Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.246436 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ca8d00b-c895-4c55-afd7-add414c51d9e","Type":"ContainerStarted","Data":"e66fb412acce8750bbcf82b71f1085888925806eb83b282705799a0d209f6c8c"} Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.246475 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ca8d00b-c895-4c55-afd7-add414c51d9e","Type":"ContainerStarted","Data":"2bd29c0215789aed0e5598e84d66401fcb43b598c235b1eef5cc50064ad78561"} Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.246588 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8ca8d00b-c895-4c55-afd7-add414c51d9e" containerName="nova-metadata-log" containerID="cri-o://2bd29c0215789aed0e5598e84d66401fcb43b598c235b1eef5cc50064ad78561" gracePeriod=30 Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.246682 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8ca8d00b-c895-4c55-afd7-add414c51d9e" containerName="nova-metadata-metadata" containerID="cri-o://e66fb412acce8750bbcf82b71f1085888925806eb83b282705799a0d209f6c8c" gracePeriod=30 Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.253487 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a","Type":"ContainerStarted","Data":"af13fdc0e466e0a911754487cc060d068f208abf36f2b1302361c580f1e79d06"} Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.253537 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a","Type":"ContainerStarted","Data":"395d89a827175a810beb99f9780394f57f193611d5d18c103bac8269a8462df4"} Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.261781 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.955653849 podStartE2EDuration="5.261761422s" podCreationTimestamp="2025-10-03 08:11:03 +0000 UTC" firstStartedPulling="2025-10-03 08:11:04.230577903 +0000 UTC m=+1365.051768393" lastFinishedPulling="2025-10-03 08:11:07.536685466 +0000 UTC m=+1368.357875966" observedRunningTime="2025-10-03 08:11:08.257068404 +0000 UTC m=+1369.078258914" watchObservedRunningTime="2025-10-03 08:11:08.261761422 +0000 UTC m=+1369.082951922" Oct 03 
08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.285220 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.9694347909999999 podStartE2EDuration="5.285199146s" podCreationTimestamp="2025-10-03 08:11:03 +0000 UTC" firstStartedPulling="2025-10-03 08:11:04.223321821 +0000 UTC m=+1365.044512311" lastFinishedPulling="2025-10-03 08:11:07.539086176 +0000 UTC m=+1368.360276666" observedRunningTime="2025-10-03 08:11:08.274671719 +0000 UTC m=+1369.095862229" watchObservedRunningTime="2025-10-03 08:11:08.285199146 +0000 UTC m=+1369.106389646" Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.302580 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.503848979 podStartE2EDuration="5.302552793s" podCreationTimestamp="2025-10-03 08:11:03 +0000 UTC" firstStartedPulling="2025-10-03 08:11:04.733373907 +0000 UTC m=+1365.554564397" lastFinishedPulling="2025-10-03 08:11:07.532077721 +0000 UTC m=+1368.353268211" observedRunningTime="2025-10-03 08:11:08.292553181 +0000 UTC m=+1369.113743691" watchObservedRunningTime="2025-10-03 08:11:08.302552793 +0000 UTC m=+1369.123743283" Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.572758 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.794078 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.794139 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 08:11:08 crc kubenswrapper[4664]: I1003 08:11:08.864310 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 08:11:09 crc kubenswrapper[4664]: I1003 08:11:09.269521 4664 generic.go:334] "Generic (PLEG): container finished" podID="8ca8d00b-c895-4c55-afd7-add414c51d9e" containerID="2bd29c0215789aed0e5598e84d66401fcb43b598c235b1eef5cc50064ad78561" exitCode=143 Oct 03 08:11:09 crc kubenswrapper[4664]: I1003 08:11:09.269877 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ca8d00b-c895-4c55-afd7-add414c51d9e","Type":"ContainerDied","Data":"2bd29c0215789aed0e5598e84d66401fcb43b598c235b1eef5cc50064ad78561"} Oct 03 08:11:09 crc kubenswrapper[4664]: I1003 08:11:09.558551 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 08:11:13 crc kubenswrapper[4664]: I1003 08:11:13.307880 4664 generic.go:334] "Generic (PLEG): container finished" podID="6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef" containerID="b0b1c809d633c9aa80a2ccb69d3579a6dc65f481cf89091f33d43f760ee397bc" exitCode=0 Oct 03 08:11:13 crc kubenswrapper[4664]: I1003 08:11:13.307981 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-srldj" event={"ID":"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef","Type":"ContainerDied","Data":"b0b1c809d633c9aa80a2ccb69d3579a6dc65f481cf89091f33d43f760ee397bc"} Oct 03 08:11:13 crc kubenswrapper[4664]: I1003 08:11:13.330185 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 08:11:13 crc kubenswrapper[4664]: I1003 08:11:13.330513 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c1dcd075-92b3-4f17-888c-2e5580a45789" 
containerName="kube-state-metrics" containerID="cri-o://f61ffab7b1d18ae4b5b27fe6766cb5a6acaedbae8434e907bbc8459f00ffbdab" gracePeriod=30 Oct 03 08:11:13 crc kubenswrapper[4664]: I1003 08:11:13.574428 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 08:11:13 crc kubenswrapper[4664]: I1003 08:11:13.607566 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 08:11:13 crc kubenswrapper[4664]: I1003 08:11:13.607900 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 08:11:13 crc kubenswrapper[4664]: I1003 08:11:13.609405 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 08:11:13 crc kubenswrapper[4664]: I1003 08:11:13.811803 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:13 crc kubenswrapper[4664]: I1003 08:11:13.925400 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-x99b6"] Oct 03 08:11:13 crc kubenswrapper[4664]: I1003 08:11:13.925898 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" podUID="40e2fe2c-8b4e-4b78-899e-1daed68626da" containerName="dnsmasq-dns" containerID="cri-o://7b78d6b5329290ad7260c2fb3e58ca6d0ae56f128ad2207d8d3dae7e1d9b9f7a" gracePeriod=10 Oct 03 08:11:13 crc kubenswrapper[4664]: I1003 08:11:13.951730 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.016695 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxrxn\" (UniqueName: \"kubernetes.io/projected/c1dcd075-92b3-4f17-888c-2e5580a45789-kube-api-access-hxrxn\") pod \"c1dcd075-92b3-4f17-888c-2e5580a45789\" (UID: \"c1dcd075-92b3-4f17-888c-2e5580a45789\") " Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.049906 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1dcd075-92b3-4f17-888c-2e5580a45789-kube-api-access-hxrxn" (OuterVolumeSpecName: "kube-api-access-hxrxn") pod "c1dcd075-92b3-4f17-888c-2e5580a45789" (UID: "c1dcd075-92b3-4f17-888c-2e5580a45789"). InnerVolumeSpecName "kube-api-access-hxrxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.120781 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxrxn\" (UniqueName: \"kubernetes.io/projected/c1dcd075-92b3-4f17-888c-2e5580a45789-kube-api-access-hxrxn\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.340697 4664 generic.go:334] "Generic (PLEG): container finished" podID="bb30837f-4a77-4972-a012-ef0c51b62cb5" containerID="6559c0f3b65e30b8ddff55a52fd232a7626191dae7979b6ec4aabe70abbb11e6" exitCode=0 Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.340858 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7c992" event={"ID":"bb30837f-4a77-4972-a012-ef0c51b62cb5","Type":"ContainerDied","Data":"6559c0f3b65e30b8ddff55a52fd232a7626191dae7979b6ec4aabe70abbb11e6"} Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.358930 4664 generic.go:334] "Generic (PLEG): container finished" podID="c1dcd075-92b3-4f17-888c-2e5580a45789" containerID="f61ffab7b1d18ae4b5b27fe6766cb5a6acaedbae8434e907bbc8459f00ffbdab" exitCode=2 Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.359045 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c1dcd075-92b3-4f17-888c-2e5580a45789","Type":"ContainerDied","Data":"f61ffab7b1d18ae4b5b27fe6766cb5a6acaedbae8434e907bbc8459f00ffbdab"} Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.359098 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c1dcd075-92b3-4f17-888c-2e5580a45789","Type":"ContainerDied","Data":"4524900e84497138721aee9226368bf37246668c11d3848b5f94816bf202c2e3"} Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.359122 4664 scope.go:117] "RemoveContainer" containerID="f61ffab7b1d18ae4b5b27fe6766cb5a6acaedbae8434e907bbc8459f00ffbdab" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.359373 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.372060 4664 generic.go:334] "Generic (PLEG): container finished" podID="40e2fe2c-8b4e-4b78-899e-1daed68626da" containerID="7b78d6b5329290ad7260c2fb3e58ca6d0ae56f128ad2207d8d3dae7e1d9b9f7a" exitCode=0 Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.372853 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" event={"ID":"40e2fe2c-8b4e-4b78-899e-1daed68626da","Type":"ContainerDied","Data":"7b78d6b5329290ad7260c2fb3e58ca6d0ae56f128ad2207d8d3dae7e1d9b9f7a"} Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.403929 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.412955 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.449373 4664 scope.go:117] "RemoveContainer" containerID="f61ffab7b1d18ae4b5b27fe6766cb5a6acaedbae8434e907bbc8459f00ffbdab" Oct 03 08:11:14 crc kubenswrapper[4664]: E1003 08:11:14.451295 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f61ffab7b1d18ae4b5b27fe6766cb5a6acaedbae8434e907bbc8459f00ffbdab\": container with ID starting with f61ffab7b1d18ae4b5b27fe6766cb5a6acaedbae8434e907bbc8459f00ffbdab not found: ID does not exist" containerID="f61ffab7b1d18ae4b5b27fe6766cb5a6acaedbae8434e907bbc8459f00ffbdab" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.451393 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61ffab7b1d18ae4b5b27fe6766cb5a6acaedbae8434e907bbc8459f00ffbdab"} err="failed to get container status \"f61ffab7b1d18ae4b5b27fe6766cb5a6acaedbae8434e907bbc8459f00ffbdab\": rpc error: code = NotFound desc = could not find container \"f61ffab7b1d18ae4b5b27fe6766cb5a6acaedbae8434e907bbc8459f00ffbdab\": container with ID starting with f61ffab7b1d18ae4b5b27fe6766cb5a6acaedbae8434e907bbc8459f00ffbdab not found: ID does not exist" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.469906 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.483815 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 08:11:14 crc kubenswrapper[4664]: E1003 08:11:14.484966 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1dcd075-92b3-4f17-888c-2e5580a45789" containerName="kube-state-metrics" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.484990 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1dcd075-92b3-4f17-888c-2e5580a45789" containerName="kube-state-metrics" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.485448 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1dcd075-92b3-4f17-888c-2e5580a45789" containerName="kube-state-metrics" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.486646 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.490207 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.490633 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.527251 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.637483 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmkd7\" (UniqueName: \"kubernetes.io/projected/f0438fa7-19ee-4886-a77a-bc835552ca0a-kube-api-access-wmkd7\") pod \"kube-state-metrics-0\" (UID: \"f0438fa7-19ee-4886-a77a-bc835552ca0a\") " pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.637567 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0438fa7-19ee-4886-a77a-bc835552ca0a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f0438fa7-19ee-4886-a77a-bc835552ca0a\") " pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.637753 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f0438fa7-19ee-4886-a77a-bc835552ca0a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f0438fa7-19ee-4886-a77a-bc835552ca0a\") " pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.637798 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0438fa7-19ee-4886-a77a-bc835552ca0a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f0438fa7-19ee-4886-a77a-bc835552ca0a\") " pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.644483 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.694767 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.694797 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.740546 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f0438fa7-19ee-4886-a77a-bc835552ca0a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f0438fa7-19ee-4886-a77a-bc835552ca0a\") " pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.740670 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0438fa7-19ee-4886-a77a-bc835552ca0a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f0438fa7-19ee-4886-a77a-bc835552ca0a\") " pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.740759 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmkd7\" (UniqueName: \"kubernetes.io/projected/f0438fa7-19ee-4886-a77a-bc835552ca0a-kube-api-access-wmkd7\") pod \"kube-state-metrics-0\" (UID: \"f0438fa7-19ee-4886-a77a-bc835552ca0a\") " pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.740836 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0438fa7-19ee-4886-a77a-bc835552ca0a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f0438fa7-19ee-4886-a77a-bc835552ca0a\") " pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.753658 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0438fa7-19ee-4886-a77a-bc835552ca0a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f0438fa7-19ee-4886-a77a-bc835552ca0a\") " pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.759457 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f0438fa7-19ee-4886-a77a-bc835552ca0a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f0438fa7-19ee-4886-a77a-bc835552ca0a\") " pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.760965 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmkd7\" (UniqueName: \"kubernetes.io/projected/f0438fa7-19ee-4886-a77a-bc835552ca0a-kube-api-access-wmkd7\") pod \"kube-state-metrics-0\" (UID: \"f0438fa7-19ee-4886-a77a-bc835552ca0a\") " pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc 
kubenswrapper[4664]: I1003 08:11:14.777378 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0438fa7-19ee-4886-a77a-bc835552ca0a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f0438fa7-19ee-4886-a77a-bc835552ca0a\") " pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.842579 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-dns-swift-storage-0\") pod \"40e2fe2c-8b4e-4b78-899e-1daed68626da\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.842681 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-ovsdbserver-nb\") pod \"40e2fe2c-8b4e-4b78-899e-1daed68626da\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.842776 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-config\") pod \"40e2fe2c-8b4e-4b78-899e-1daed68626da\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.842816 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5x7s\" (UniqueName: \"kubernetes.io/projected/40e2fe2c-8b4e-4b78-899e-1daed68626da-kube-api-access-b5x7s\") pod \"40e2fe2c-8b4e-4b78-899e-1daed68626da\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.842867 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-dns-svc\") pod \"40e2fe2c-8b4e-4b78-899e-1daed68626da\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.843523 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-ovsdbserver-sb\") pod \"40e2fe2c-8b4e-4b78-899e-1daed68626da\" (UID: \"40e2fe2c-8b4e-4b78-899e-1daed68626da\") " Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.848991 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40e2fe2c-8b4e-4b78-899e-1daed68626da-kube-api-access-b5x7s" (OuterVolumeSpecName: "kube-api-access-b5x7s") pod "40e2fe2c-8b4e-4b78-899e-1daed68626da" (UID: "40e2fe2c-8b4e-4b78-899e-1daed68626da"). InnerVolumeSpecName "kube-api-access-b5x7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.911137 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "40e2fe2c-8b4e-4b78-899e-1daed68626da" (UID: "40e2fe2c-8b4e-4b78-899e-1daed68626da"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.915326 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "40e2fe2c-8b4e-4b78-899e-1daed68626da" (UID: "40e2fe2c-8b4e-4b78-899e-1daed68626da"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.921729 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-config" (OuterVolumeSpecName: "config") pod "40e2fe2c-8b4e-4b78-899e-1daed68626da" (UID: "40e2fe2c-8b4e-4b78-899e-1daed68626da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.934022 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "40e2fe2c-8b4e-4b78-899e-1daed68626da" (UID: "40e2fe2c-8b4e-4b78-899e-1daed68626da"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.936480 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "40e2fe2c-8b4e-4b78-899e-1daed68626da" (UID: "40e2fe2c-8b4e-4b78-899e-1daed68626da"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.941233 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.951440 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.951524 4664 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.951540 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.951554 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.951566 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5x7s\" (UniqueName: \"kubernetes.io/projected/40e2fe2c-8b4e-4b78-899e-1daed68626da-kube-api-access-b5x7s\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.951579 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40e2fe2c-8b4e-4b78-899e-1daed68626da-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:14 crc kubenswrapper[4664]: I1003 08:11:14.988035 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.156226 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-config-data\") pod \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.156332 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-combined-ca-bundle\") pod \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.156491 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lxl5\" (UniqueName: \"kubernetes.io/projected/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-kube-api-access-7lxl5\") pod \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.156668 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-scripts\") pod \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\" (UID: \"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef\") " Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.164129 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-scripts" (OuterVolumeSpecName: "scripts") pod "6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef" (UID: "6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.165226 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-kube-api-access-7lxl5" (OuterVolumeSpecName: "kube-api-access-7lxl5") pod "6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef" (UID: "6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef"). InnerVolumeSpecName "kube-api-access-7lxl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.194889 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef" (UID: "6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.206934 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-config-data" (OuterVolumeSpecName: "config-data") pod "6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef" (UID: "6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.266673 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.266714 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.266730 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lxl5\" (UniqueName: \"kubernetes.io/projected/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-kube-api-access-7lxl5\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.266741 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.389990 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" event={"ID":"40e2fe2c-8b4e-4b78-899e-1daed68626da","Type":"ContainerDied","Data":"9aca1718d202ab8145850318a6c0f977caccab430da3a348cc22376004b729f0"} Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.390041 4664 scope.go:117] "RemoveContainer" containerID="7b78d6b5329290ad7260c2fb3e58ca6d0ae56f128ad2207d8d3dae7e1d9b9f7a" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.390142 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.399631 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-srldj" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.400774 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-srldj" event={"ID":"6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef","Type":"ContainerDied","Data":"f6031a24aa714ebed026f2bb0d33e6579dc080919b71eb45fd4be22e19d59a2f"} Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.400813 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6031a24aa714ebed026f2bb0d33e6579dc080919b71eb45fd4be22e19d59a2f" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.424749 4664 scope.go:117] "RemoveContainer" containerID="76dd6b14ebccb0558775bfe7eb0e9b14dca7ad99855d3395322a024bf61a3cbd" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.441114 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-x99b6"] Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.457871 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-x99b6"] Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.491042 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.554551 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.554775 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" containerName="nova-api-log" containerID="cri-o://395d89a827175a810beb99f9780394f57f193611d5d18c103bac8269a8462df4" gracePeriod=30 Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.555246 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" containerName="nova-api-api" containerID="cri-o://af13fdc0e466e0a911754487cc060d068f208abf36f2b1302361c580f1e79d06" gracePeriod=30 Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.616819 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.896486 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40e2fe2c-8b4e-4b78-899e-1daed68626da" path="/var/lib/kubelet/pods/40e2fe2c-8b4e-4b78-899e-1daed68626da/volumes" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.897724 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1dcd075-92b3-4f17-888c-2e5580a45789" path="/var/lib/kubelet/pods/c1dcd075-92b3-4f17-888c-2e5580a45789/volumes" Oct 03 08:11:15 crc kubenswrapper[4664]: I1003 08:11:15.904293 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.076924 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.077273 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="ceilometer-central-agent" containerID="cri-o://ecb74b1639508af304c6b86b47042ebedf8fada82a3eee7bfad3d2c1c67c78c1" gracePeriod=30 Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.077394 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="ceilometer-notification-agent" containerID="cri-o://b49ad7c2b6db5d70aaef83dc2b4b5b0b0ed1c6dcce10190324332842a156941d" gracePeriod=30 Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.077386 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="proxy-httpd" containerID="cri-o://e1d7af77eda09d0f41ed8a469b12a776492acbb0035601478093de2735b67b6a" gracePeriod=30 Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.077397 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="sg-core" containerID="cri-o://62613db899840e1c90b3db2b2802f4bd1d6263fcc10f9206537360bf95541040" gracePeriod=30 Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.086560 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6rq5\" (UniqueName: \"kubernetes.io/projected/bb30837f-4a77-4972-a012-ef0c51b62cb5-kube-api-access-m6rq5\") pod \"bb30837f-4a77-4972-a012-ef0c51b62cb5\" (UID: \"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.086729 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-combined-ca-bundle\") pod \"bb30837f-4a77-4972-a012-ef0c51b62cb5\" (UID: \"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.086857 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-scripts\") pod \"bb30837f-4a77-4972-a012-ef0c51b62cb5\" (UID: \"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.086894 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-config-data\") pod \"bb30837f-4a77-4972-a012-ef0c51b62cb5\" (UID: \"bb30837f-4a77-4972-a012-ef0c51b62cb5\") " Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.092384 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb30837f-4a77-4972-a012-ef0c51b62cb5-kube-api-access-m6rq5" (OuterVolumeSpecName: "kube-api-access-m6rq5") pod "bb30837f-4a77-4972-a012-ef0c51b62cb5" (UID: "bb30837f-4a77-4972-a012-ef0c51b62cb5"). InnerVolumeSpecName "kube-api-access-m6rq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.095356 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-scripts" (OuterVolumeSpecName: "scripts") pod "bb30837f-4a77-4972-a012-ef0c51b62cb5" (UID: "bb30837f-4a77-4972-a012-ef0c51b62cb5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.122560 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb30837f-4a77-4972-a012-ef0c51b62cb5" (UID: "bb30837f-4a77-4972-a012-ef0c51b62cb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.136761 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-config-data" (OuterVolumeSpecName: "config-data") pod "bb30837f-4a77-4972-a012-ef0c51b62cb5" (UID: "bb30837f-4a77-4972-a012-ef0c51b62cb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.189465 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.189514 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.189532 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6rq5\" (UniqueName: \"kubernetes.io/projected/bb30837f-4a77-4972-a012-ef0c51b62cb5-kube-api-access-m6rq5\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.189548 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb30837f-4a77-4972-a012-ef0c51b62cb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.423423 4664 generic.go:334] "Generic (PLEG): container finished" podID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerID="e1d7af77eda09d0f41ed8a469b12a776492acbb0035601478093de2735b67b6a" exitCode=0 Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.423480 4664 generic.go:334] "Generic (PLEG): container finished" podID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerID="62613db899840e1c90b3db2b2802f4bd1d6263fcc10f9206537360bf95541040" exitCode=2 Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.423540 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c98f4aa-c34f-4ccb-9198-d2daf975e5df","Type":"ContainerDied","Data":"e1d7af77eda09d0f41ed8a469b12a776492acbb0035601478093de2735b67b6a"} Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.423582 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c98f4aa-c34f-4ccb-9198-d2daf975e5df","Type":"ContainerDied","Data":"62613db899840e1c90b3db2b2802f4bd1d6263fcc10f9206537360bf95541040"} Oct 03 08:11:16 crc 
kubenswrapper[4664]: I1003 08:11:16.427879 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7c992" event={"ID":"bb30837f-4a77-4972-a012-ef0c51b62cb5","Type":"ContainerDied","Data":"acd991a66150336a98f95cd11f4462e106fa5e7278e4071c69e9ce23dd9e1074"} Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.427930 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acd991a66150336a98f95cd11f4462e106fa5e7278e4071c69e9ce23dd9e1074" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.427999 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7c992" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.450129 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f0438fa7-19ee-4886-a77a-bc835552ca0a","Type":"ContainerStarted","Data":"4cb3badde7d20bb3e0276897e37c33a1848828fc20fea98467b0d3cd0fa4cf8e"} Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.450466 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f0438fa7-19ee-4886-a77a-bc835552ca0a","Type":"ContainerStarted","Data":"638271e9277478f10ea62a005135dbe37a5102813cd411b0153632072163f12b"} Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.451033 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.464728 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 08:11:16 crc kubenswrapper[4664]: E1003 08:11:16.465384 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb30837f-4a77-4972-a012-ef0c51b62cb5" containerName="nova-cell1-conductor-db-sync" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.465401 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb30837f-4a77-4972-a012-ef0c51b62cb5" containerName="nova-cell1-conductor-db-sync" Oct 03 08:11:16 crc kubenswrapper[4664]: E1003 08:11:16.465410 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef" containerName="nova-manage" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.465417 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef" containerName="nova-manage" Oct 03 08:11:16 crc kubenswrapper[4664]: E1003 08:11:16.465444 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e2fe2c-8b4e-4b78-899e-1daed68626da" containerName="init" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.465452 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e2fe2c-8b4e-4b78-899e-1daed68626da" containerName="init" Oct 03 08:11:16 crc kubenswrapper[4664]: E1003 08:11:16.465464 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e2fe2c-8b4e-4b78-899e-1daed68626da" containerName="dnsmasq-dns" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.465470 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e2fe2c-8b4e-4b78-899e-1daed68626da" containerName="dnsmasq-dns" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.465775 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef" containerName="nova-manage" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.465799 4664 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="40e2fe2c-8b4e-4b78-899e-1daed68626da" containerName="dnsmasq-dns" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.465818 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb30837f-4a77-4972-a012-ef0c51b62cb5" containerName="nova-cell1-conductor-db-sync" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.467182 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.472894 4664 generic.go:334] "Generic (PLEG): container finished" podID="3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" containerID="395d89a827175a810beb99f9780394f57f193611d5d18c103bac8269a8462df4" exitCode=143 Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.473429 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8557bd77-628d-4cbd-8570-05897fa8495e" containerName="nova-scheduler-scheduler" containerID="cri-o://c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24" gracePeriod=30 Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.473118 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a","Type":"ContainerDied","Data":"395d89a827175a810beb99f9780394f57f193611d5d18c103bac8269a8462df4"} Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.476940 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.486402 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.515805 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.122198752 podStartE2EDuration="2.515778366s" podCreationTimestamp="2025-10-03 08:11:14 +0000 UTC" firstStartedPulling="2025-10-03 08:11:15.507674275 +0000 UTC m=+1376.328864765" lastFinishedPulling="2025-10-03 08:11:15.901253889 +0000 UTC m=+1376.722444379" observedRunningTime="2025-10-03 08:11:16.48680479 +0000 UTC m=+1377.307995290" watchObservedRunningTime="2025-10-03 08:11:16.515778366 +0000 UTC m=+1377.336968856" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.609780 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r765g\" (UniqueName: \"kubernetes.io/projected/647a5434-1e87-43fa-a85c-e57c75f985f3-kube-api-access-r765g\") pod \"nova-cell1-conductor-0\" (UID: \"647a5434-1e87-43fa-a85c-e57c75f985f3\") " pod="openstack/nova-cell1-conductor-0" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.609889 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/647a5434-1e87-43fa-a85c-e57c75f985f3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"647a5434-1e87-43fa-a85c-e57c75f985f3\") " pod="openstack/nova-cell1-conductor-0" Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.609928 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/647a5434-1e87-43fa-a85c-e57c75f985f3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"647a5434-1e87-43fa-a85c-e57c75f985f3\") " pod="openstack/nova-cell1-conductor-0" Oct 03 08:11:16 crc 
kubenswrapper[4664]: I1003 08:11:16.712418 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/647a5434-1e87-43fa-a85c-e57c75f985f3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"647a5434-1e87-43fa-a85c-e57c75f985f3\") " pod="openstack/nova-cell1-conductor-0"
Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.712491 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/647a5434-1e87-43fa-a85c-e57c75f985f3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"647a5434-1e87-43fa-a85c-e57c75f985f3\") " pod="openstack/nova-cell1-conductor-0"
Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.712677 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r765g\" (UniqueName: \"kubernetes.io/projected/647a5434-1e87-43fa-a85c-e57c75f985f3-kube-api-access-r765g\") pod \"nova-cell1-conductor-0\" (UID: \"647a5434-1e87-43fa-a85c-e57c75f985f3\") " pod="openstack/nova-cell1-conductor-0"
Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.725763 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/647a5434-1e87-43fa-a85c-e57c75f985f3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"647a5434-1e87-43fa-a85c-e57c75f985f3\") " pod="openstack/nova-cell1-conductor-0"
Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.725767 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/647a5434-1e87-43fa-a85c-e57c75f985f3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"647a5434-1e87-43fa-a85c-e57c75f985f3\") " pod="openstack/nova-cell1-conductor-0"
Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.760164 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r765g\" (UniqueName: \"kubernetes.io/projected/647a5434-1e87-43fa-a85c-e57c75f985f3-kube-api-access-r765g\") pod \"nova-cell1-conductor-0\" (UID: \"647a5434-1e87-43fa-a85c-e57c75f985f3\") " pod="openstack/nova-cell1-conductor-0"
Oct 03 08:11:16 crc kubenswrapper[4664]: I1003 08:11:16.876905 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 03 08:11:17 crc kubenswrapper[4664]: I1003 08:11:17.361123 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 03 08:11:17 crc kubenswrapper[4664]: I1003 08:11:17.490391 4664 generic.go:334] "Generic (PLEG): container finished" podID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerID="ecb74b1639508af304c6b86b47042ebedf8fada82a3eee7bfad3d2c1c67c78c1" exitCode=0
Oct 03 08:11:17 crc kubenswrapper[4664]: I1003 08:11:17.490500 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c98f4aa-c34f-4ccb-9198-d2daf975e5df","Type":"ContainerDied","Data":"ecb74b1639508af304c6b86b47042ebedf8fada82a3eee7bfad3d2c1c67c78c1"}
Oct 03 08:11:17 crc kubenswrapper[4664]: I1003 08:11:17.496545 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"647a5434-1e87-43fa-a85c-e57c75f985f3","Type":"ContainerStarted","Data":"9452c3e81b8bbd9459db43198174c13bb07dbb114d463220173dc45594fdeea9"}
Oct 03 08:11:18 crc kubenswrapper[4664]: I1003 08:11:18.505891 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"647a5434-1e87-43fa-a85c-e57c75f985f3","Type":"ContainerStarted","Data":"42a2694bc955c496cfd5733c4fb82acbdb92e0b65a1edc808df6508c1db7ada1"}
Oct 03 08:11:18 crc kubenswrapper[4664]: I1003 08:11:18.506466 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Oct 03 08:11:18 crc kubenswrapper[4664]: I1003 08:11:18.529231 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.529212367 podStartE2EDuration="2.529212367s" podCreationTimestamp="2025-10-03 08:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:11:18.523688976 +0000 UTC m=+1379.344879486" watchObservedRunningTime="2025-10-03 08:11:18.529212367 +0000 UTC m=+1379.350402857"
Oct 03 08:11:18 crc kubenswrapper[4664]: E1003 08:11:18.577886 4664 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 03 08:11:18 crc kubenswrapper[4664]: E1003 08:11:18.579891 4664 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 03 08:11:18 crc kubenswrapper[4664]: E1003 08:11:18.582887 4664 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 03 08:11:18 crc kubenswrapper[4664]: E1003 08:11:18.582932 4664 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="8557bd77-628d-4cbd-8570-05897fa8495e" containerName="nova-scheduler-scheduler"
Oct 03 08:11:19 crc kubenswrapper[4664]: I1003 08:11:19.241723 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bb4fc677f-x99b6" podUID="40e2fe2c-8b4e-4b78-899e-1daed68626da" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.162:5353: i/o timeout"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.186028 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.245978 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.293164 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfmwc\" (UniqueName: \"kubernetes.io/projected/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-kube-api-access-nfmwc\") pod \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") "
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.293251 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-config-data\") pod \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") "
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.293313 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-run-httpd\") pod \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") "
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.293373 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-combined-ca-bundle\") pod \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") "
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.293463 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-scripts\") pod \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") "
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.293650 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-sg-core-conf-yaml\") pod \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") "
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.293682 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-log-httpd\") pod \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\" (UID: \"9c98f4aa-c34f-4ccb-9198-d2daf975e5df\") "
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.294696 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9c98f4aa-c34f-4ccb-9198-d2daf975e5df" (UID: "9c98f4aa-c34f-4ccb-9198-d2daf975e5df"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.295593 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9c98f4aa-c34f-4ccb-9198-d2daf975e5df" (UID: "9c98f4aa-c34f-4ccb-9198-d2daf975e5df"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.302187 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-scripts" (OuterVolumeSpecName: "scripts") pod "9c98f4aa-c34f-4ccb-9198-d2daf975e5df" (UID: "9c98f4aa-c34f-4ccb-9198-d2daf975e5df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.302857 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-kube-api-access-nfmwc" (OuterVolumeSpecName: "kube-api-access-nfmwc") pod "9c98f4aa-c34f-4ccb-9198-d2daf975e5df" (UID: "9c98f4aa-c34f-4ccb-9198-d2daf975e5df"). InnerVolumeSpecName "kube-api-access-nfmwc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.332794 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9c98f4aa-c34f-4ccb-9198-d2daf975e5df" (UID: "9c98f4aa-c34f-4ccb-9198-d2daf975e5df"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.395744 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8557bd77-628d-4cbd-8570-05897fa8495e-combined-ca-bundle\") pod \"8557bd77-628d-4cbd-8570-05897fa8495e\" (UID: \"8557bd77-628d-4cbd-8570-05897fa8495e\") "
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.396100 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8557bd77-628d-4cbd-8570-05897fa8495e-config-data\") pod \"8557bd77-628d-4cbd-8570-05897fa8495e\" (UID: \"8557bd77-628d-4cbd-8570-05897fa8495e\") "
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.396221 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2p82\" (UniqueName: \"kubernetes.io/projected/8557bd77-628d-4cbd-8570-05897fa8495e-kube-api-access-l2p82\") pod \"8557bd77-628d-4cbd-8570-05897fa8495e\" (UID: \"8557bd77-628d-4cbd-8570-05897fa8495e\") "
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.396886 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.396965 4664 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.397063 4664 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.397150 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfmwc\" (UniqueName: \"kubernetes.io/projected/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-kube-api-access-nfmwc\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.397233 4664 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.400327 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c98f4aa-c34f-4ccb-9198-d2daf975e5df" (UID: "9c98f4aa-c34f-4ccb-9198-d2daf975e5df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.400404 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8557bd77-628d-4cbd-8570-05897fa8495e-kube-api-access-l2p82" (OuterVolumeSpecName: "kube-api-access-l2p82") pod "8557bd77-628d-4cbd-8570-05897fa8495e" (UID: "8557bd77-628d-4cbd-8570-05897fa8495e"). InnerVolumeSpecName "kube-api-access-l2p82". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.429323 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8557bd77-628d-4cbd-8570-05897fa8495e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8557bd77-628d-4cbd-8570-05897fa8495e" (UID: "8557bd77-628d-4cbd-8570-05897fa8495e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.429565 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-config-data" (OuterVolumeSpecName: "config-data") pod "9c98f4aa-c34f-4ccb-9198-d2daf975e5df" (UID: "9c98f4aa-c34f-4ccb-9198-d2daf975e5df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.441838 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8557bd77-628d-4cbd-8570-05897fa8495e-config-data" (OuterVolumeSpecName: "config-data") pod "8557bd77-628d-4cbd-8570-05897fa8495e" (UID: "8557bd77-628d-4cbd-8570-05897fa8495e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.498940 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.499337 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c98f4aa-c34f-4ccb-9198-d2daf975e5df-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.499440 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8557bd77-628d-4cbd-8570-05897fa8495e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.499523 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8557bd77-628d-4cbd-8570-05897fa8495e-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.499629 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2p82\" (UniqueName: \"kubernetes.io/projected/8557bd77-628d-4cbd-8570-05897fa8495e-kube-api-access-l2p82\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.533527 4664 generic.go:334] "Generic (PLEG): container finished" podID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerID="b49ad7c2b6db5d70aaef83dc2b4b5b0b0ed1c6dcce10190324332842a156941d" exitCode=0
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.533755 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c98f4aa-c34f-4ccb-9198-d2daf975e5df","Type":"ContainerDied","Data":"b49ad7c2b6db5d70aaef83dc2b4b5b0b0ed1c6dcce10190324332842a156941d"}
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.533795 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c98f4aa-c34f-4ccb-9198-d2daf975e5df","Type":"ContainerDied","Data":"1d8c80d4d32f8cefbc4f9606e7336a30d4086a19871787c47b81454294e67df6"}
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.533820 4664 scope.go:117] "RemoveContainer" containerID="e1d7af77eda09d0f41ed8a469b12a776492acbb0035601478093de2735b67b6a"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.533832 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.537901 4664 generic.go:334] "Generic (PLEG): container finished" podID="8557bd77-628d-4cbd-8570-05897fa8495e" containerID="c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24" exitCode=0
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.537962 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8557bd77-628d-4cbd-8570-05897fa8495e","Type":"ContainerDied","Data":"c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24"}
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.537994 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8557bd77-628d-4cbd-8570-05897fa8495e","Type":"ContainerDied","Data":"5567f02afcbdb6769855d30e14868796e2ea93463306e6814a5fd08a616d63f0"}
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.538054 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.566418 4664 scope.go:117] "RemoveContainer" containerID="62613db899840e1c90b3db2b2802f4bd1d6263fcc10f9206537360bf95541040"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.598137 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.608417 4664 scope.go:117] "RemoveContainer" containerID="b49ad7c2b6db5d70aaef83dc2b4b5b0b0ed1c6dcce10190324332842a156941d"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.616822 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.646672 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.664403 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.672408 4664 scope.go:117] "RemoveContainer" containerID="ecb74b1639508af304c6b86b47042ebedf8fada82a3eee7bfad3d2c1c67c78c1"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.679850 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 03 08:11:20 crc kubenswrapper[4664]: E1003 08:11:20.680299 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="ceilometer-central-agent"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.680317 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="ceilometer-central-agent"
Oct 03 08:11:20 crc kubenswrapper[4664]: E1003 08:11:20.680331 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="sg-core"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.680337 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="sg-core"
Oct 03 08:11:20 crc kubenswrapper[4664]: E1003 08:11:20.680361 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="ceilometer-notification-agent"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.680368 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="ceilometer-notification-agent"
Oct 03 08:11:20 crc kubenswrapper[4664]: E1003 08:11:20.680382 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="proxy-httpd"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.680388 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="proxy-httpd"
Oct 03 08:11:20 crc kubenswrapper[4664]: E1003 08:11:20.680403 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8557bd77-628d-4cbd-8570-05897fa8495e" containerName="nova-scheduler-scheduler"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.680409 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8557bd77-628d-4cbd-8570-05897fa8495e" containerName="nova-scheduler-scheduler"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.680589 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8557bd77-628d-4cbd-8570-05897fa8495e" containerName="nova-scheduler-scheduler"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.680622 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="sg-core"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.680633 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="ceilometer-central-agent"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.680648 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="ceilometer-notification-agent"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.680658 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" containerName="proxy-httpd"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.682379 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.688182 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.688248 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.688344 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.693397 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.696205 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.699577 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.714250 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.726102 4664 scope.go:117] "RemoveContainer" containerID="e1d7af77eda09d0f41ed8a469b12a776492acbb0035601478093de2735b67b6a"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.727622 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 08:11:20 crc kubenswrapper[4664]: E1003 08:11:20.728296 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1d7af77eda09d0f41ed8a469b12a776492acbb0035601478093de2735b67b6a\": container with ID starting with e1d7af77eda09d0f41ed8a469b12a776492acbb0035601478093de2735b67b6a not found: ID does not exist" containerID="e1d7af77eda09d0f41ed8a469b12a776492acbb0035601478093de2735b67b6a"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.728330 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1d7af77eda09d0f41ed8a469b12a776492acbb0035601478093de2735b67b6a"} err="failed to get container status \"e1d7af77eda09d0f41ed8a469b12a776492acbb0035601478093de2735b67b6a\": rpc error: code = NotFound desc = could not find container \"e1d7af77eda09d0f41ed8a469b12a776492acbb0035601478093de2735b67b6a\": container with ID starting with e1d7af77eda09d0f41ed8a469b12a776492acbb0035601478093de2735b67b6a not found: ID does not exist"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.728353 4664 scope.go:117] "RemoveContainer" containerID="62613db899840e1c90b3db2b2802f4bd1d6263fcc10f9206537360bf95541040"
Oct 03 08:11:20 crc kubenswrapper[4664]: E1003 08:11:20.733727 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62613db899840e1c90b3db2b2802f4bd1d6263fcc10f9206537360bf95541040\": container with ID starting with 62613db899840e1c90b3db2b2802f4bd1d6263fcc10f9206537360bf95541040 not found: ID does not exist" containerID="62613db899840e1c90b3db2b2802f4bd1d6263fcc10f9206537360bf95541040"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.733775 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62613db899840e1c90b3db2b2802f4bd1d6263fcc10f9206537360bf95541040"} err="failed to get container status \"62613db899840e1c90b3db2b2802f4bd1d6263fcc10f9206537360bf95541040\": rpc error: code = NotFound desc = could not find container \"62613db899840e1c90b3db2b2802f4bd1d6263fcc10f9206537360bf95541040\": container with ID starting with 62613db899840e1c90b3db2b2802f4bd1d6263fcc10f9206537360bf95541040 not found: ID does not exist"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.733847 4664 scope.go:117] "RemoveContainer" containerID="b49ad7c2b6db5d70aaef83dc2b4b5b0b0ed1c6dcce10190324332842a156941d"
Oct 03 08:11:20 crc kubenswrapper[4664]: E1003 08:11:20.741942 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b49ad7c2b6db5d70aaef83dc2b4b5b0b0ed1c6dcce10190324332842a156941d\": container with ID starting with b49ad7c2b6db5d70aaef83dc2b4b5b0b0ed1c6dcce10190324332842a156941d not found: ID does not exist" containerID="b49ad7c2b6db5d70aaef83dc2b4b5b0b0ed1c6dcce10190324332842a156941d"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.742007 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b49ad7c2b6db5d70aaef83dc2b4b5b0b0ed1c6dcce10190324332842a156941d"} err="failed to get container status \"b49ad7c2b6db5d70aaef83dc2b4b5b0b0ed1c6dcce10190324332842a156941d\": rpc error: code = NotFound desc = could not find container \"b49ad7c2b6db5d70aaef83dc2b4b5b0b0ed1c6dcce10190324332842a156941d\": container with ID starting with b49ad7c2b6db5d70aaef83dc2b4b5b0b0ed1c6dcce10190324332842a156941d not found: ID does not exist"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.742049 4664 scope.go:117] "RemoveContainer" containerID="ecb74b1639508af304c6b86b47042ebedf8fada82a3eee7bfad3d2c1c67c78c1"
Oct 03 08:11:20 crc kubenswrapper[4664]: E1003 08:11:20.744650 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecb74b1639508af304c6b86b47042ebedf8fada82a3eee7bfad3d2c1c67c78c1\": container with ID starting with ecb74b1639508af304c6b86b47042ebedf8fada82a3eee7bfad3d2c1c67c78c1 not found: ID does not exist" containerID="ecb74b1639508af304c6b86b47042ebedf8fada82a3eee7bfad3d2c1c67c78c1"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.744682 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecb74b1639508af304c6b86b47042ebedf8fada82a3eee7bfad3d2c1c67c78c1"} err="failed to get container status \"ecb74b1639508af304c6b86b47042ebedf8fada82a3eee7bfad3d2c1c67c78c1\": rpc error: code = NotFound desc = could not find container \"ecb74b1639508af304c6b86b47042ebedf8fada82a3eee7bfad3d2c1c67c78c1\": container with ID starting with ecb74b1639508af304c6b86b47042ebedf8fada82a3eee7bfad3d2c1c67c78c1 not found: ID does not exist"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.744705 4664 scope.go:117] "RemoveContainer" containerID="c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.779880 4664 scope.go:117] "RemoveContainer" containerID="c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24"
Oct 03 08:11:20 crc kubenswrapper[4664]: E1003 08:11:20.781466 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24\": container with ID starting with c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24 not found: ID does not exist" containerID="c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.781513 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24"} err="failed to get container status \"c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24\": rpc error: code = NotFound desc = could not find container \"c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24\": container with ID starting with c222bd6eba8f217b0e4185bb409730230461d1d69b2f20b6406ca182f1863e24 not found: ID does not exist"
Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.805103 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38363065-9fad-46b4-bb38-3cf143e66913-log-httpd\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0"
\"kubernetes.io/empty-dir/38363065-9fad-46b4-bb38-3cf143e66913-log-httpd\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.805165 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.805474 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-config-data\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.805586 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.805654 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62e85e1-8010-42b2-b674-271b49596620-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b62e85e1-8010-42b2-b674-271b49596620\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.805759 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.805928 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38363065-9fad-46b4-bb38-3cf143e66913-run-httpd\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.806031 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-scripts\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.806117 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x87lc\" (UniqueName: \"kubernetes.io/projected/b62e85e1-8010-42b2-b674-271b49596620-kube-api-access-x87lc\") pod \"nova-scheduler-0\" (UID: \"b62e85e1-8010-42b2-b674-271b49596620\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.806258 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62e85e1-8010-42b2-b674-271b49596620-config-data\") pod \"nova-scheduler-0\" (UID: \"b62e85e1-8010-42b2-b674-271b49596620\") " 
pod="openstack/nova-scheduler-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.806303 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpfjm\" (UniqueName: \"kubernetes.io/projected/38363065-9fad-46b4-bb38-3cf143e66913-kube-api-access-jpfjm\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.908961 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-scripts\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.909034 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x87lc\" (UniqueName: \"kubernetes.io/projected/b62e85e1-8010-42b2-b674-271b49596620-kube-api-access-x87lc\") pod \"nova-scheduler-0\" (UID: \"b62e85e1-8010-42b2-b674-271b49596620\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.909063 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62e85e1-8010-42b2-b674-271b49596620-config-data\") pod \"nova-scheduler-0\" (UID: \"b62e85e1-8010-42b2-b674-271b49596620\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.909082 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpfjm\" (UniqueName: \"kubernetes.io/projected/38363065-9fad-46b4-bb38-3cf143e66913-kube-api-access-jpfjm\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.909117 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38363065-9fad-46b4-bb38-3cf143e66913-log-httpd\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.909143 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.909207 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-config-data\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.909245 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.909272 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62e85e1-8010-42b2-b674-271b49596620-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"b62e85e1-8010-42b2-b674-271b49596620\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.909340 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.909433 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38363065-9fad-46b4-bb38-3cf143e66913-run-httpd\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.910550 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38363065-9fad-46b4-bb38-3cf143e66913-run-httpd\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.910626 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38363065-9fad-46b4-bb38-3cf143e66913-log-httpd\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.916006 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-config-data\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.920254 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62e85e1-8010-42b2-b674-271b49596620-config-data\") pod \"nova-scheduler-0\" (UID: \"b62e85e1-8010-42b2-b674-271b49596620\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.924123 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.924347 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.925376 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.933657 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62e85e1-8010-42b2-b674-271b49596620-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"b62e85e1-8010-42b2-b674-271b49596620\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.935571 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpfjm\" (UniqueName: \"kubernetes.io/projected/38363065-9fad-46b4-bb38-3cf143e66913-kube-api-access-jpfjm\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.938431 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-scripts\") pod \"ceilometer-0\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") " pod="openstack/ceilometer-0" Oct 03 08:11:20 crc kubenswrapper[4664]: I1003 08:11:20.945106 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x87lc\" (UniqueName: \"kubernetes.io/projected/b62e85e1-8010-42b2-b674-271b49596620-kube-api-access-x87lc\") pod \"nova-scheduler-0\" (UID: \"b62e85e1-8010-42b2-b674-271b49596620\") " pod="openstack/nova-scheduler-0" Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.087323 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.099906 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.567420 4664 generic.go:334] "Generic (PLEG): container finished" podID="3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" containerID="af13fdc0e466e0a911754487cc060d068f208abf36f2b1302361c580f1e79d06" exitCode=0 Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.567732 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a","Type":"ContainerDied","Data":"af13fdc0e466e0a911754487cc060d068f208abf36f2b1302361c580f1e79d06"} Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.646534 4664 util.go:48] "No ready sandbox for pod can be found. 
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.655525 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.670809 4664 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.701544 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-logs\") pod \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") "
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.702271 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-combined-ca-bundle\") pod \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") "
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.702341 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p49g\" (UniqueName: \"kubernetes.io/projected/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-kube-api-access-9p49g\") pod \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") "
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.702549 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-config-data\") pod \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\" (UID: \"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a\") "
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.706455 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-logs" (OuterVolumeSpecName: "logs") pod "3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" (UID: "3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.720891 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-kube-api-access-9p49g" (OuterVolumeSpecName: "kube-api-access-9p49g") pod "3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" (UID: "3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a"). InnerVolumeSpecName "kube-api-access-9p49g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.773768 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.782550 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" (UID: "3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.797005 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-config-data" (OuterVolumeSpecName: "config-data") pod "3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" (UID: "3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.806658 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-logs\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.806682 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.806694 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p49g\" (UniqueName: \"kubernetes.io/projected/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-kube-api-access-9p49g\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.806708 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.888430 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8557bd77-628d-4cbd-8570-05897fa8495e" path="/var/lib/kubelet/pods/8557bd77-628d-4cbd-8570-05897fa8495e/volumes"
Oct 03 08:11:21 crc kubenswrapper[4664]: I1003 08:11:21.889291 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c98f4aa-c34f-4ccb-9198-d2daf975e5df" path="/var/lib/kubelet/pods/9c98f4aa-c34f-4ccb-9198-d2daf975e5df/volumes"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.591513 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b62e85e1-8010-42b2-b674-271b49596620","Type":"ContainerStarted","Data":"191612892f5f8a7fadf384544affdfc81939e55e4ae4d741a1451acc4f2e8551"}
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.591858 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b62e85e1-8010-42b2-b674-271b49596620","Type":"ContainerStarted","Data":"d80acc6cb40df6df76226fed4de0cf3921f094654f765edcf8bad1726760d7a8"}
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.594734 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38363065-9fad-46b4-bb38-3cf143e66913","Type":"ContainerStarted","Data":"41fe124b4625424ca380f7fdcc0cc750af67596a8f412a25a1ba046926232d6f"}
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.594789 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38363065-9fad-46b4-bb38-3cf143e66913","Type":"ContainerStarted","Data":"88d2ab128a09febeb19f4bf4fd319cb9bf5f565a1a08832cccc4fb7fa1bec159"}
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.602740 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a","Type":"ContainerDied","Data":"2146ee7a295ba4d67b57644b0689b281cba7deec152e6b7a62f0ee6dac91f3b3"}
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.602802 4664 scope.go:117] "RemoveContainer" containerID="af13fdc0e466e0a911754487cc060d068f208abf36f2b1302361c580f1e79d06"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.603144 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.624141 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6241159769999998 podStartE2EDuration="2.624115977s" podCreationTimestamp="2025-10-03 08:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:11:22.611740835 +0000 UTC m=+1383.432931335" watchObservedRunningTime="2025-10-03 08:11:22.624115977 +0000 UTC m=+1383.445306487"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.649704 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.664504 4664 scope.go:117] "RemoveContainer" containerID="395d89a827175a810beb99f9780394f57f193611d5d18c103bac8269a8462df4"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.664686 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.681729 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 03 08:11:22 crc kubenswrapper[4664]: E1003 08:11:22.682207 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" containerName="nova-api-log"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.682226 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" containerName="nova-api-log"
Oct 03 08:11:22 crc kubenswrapper[4664]: E1003 08:11:22.682258 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" containerName="nova-api-api"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.682264 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" containerName="nova-api-api"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.682469 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" containerName="nova-api-api"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.682488 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" containerName="nova-api-log"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.685162 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.691896 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.692047 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.726492 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c810060-a84b-47cd-95cc-583e8ef6a535-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " pod="openstack/nova-api-0"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.726567 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nt2b\" (UniqueName: \"kubernetes.io/projected/0c810060-a84b-47cd-95cc-583e8ef6a535-kube-api-access-4nt2b\") pod \"nova-api-0\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " pod="openstack/nova-api-0"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.726630 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c810060-a84b-47cd-95cc-583e8ef6a535-config-data\") pod \"nova-api-0\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " pod="openstack/nova-api-0"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.726702 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c810060-a84b-47cd-95cc-583e8ef6a535-logs\") pod \"nova-api-0\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " pod="openstack/nova-api-0"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.828700 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c810060-a84b-47cd-95cc-583e8ef6a535-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " pod="openstack/nova-api-0"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.829130 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nt2b\" (UniqueName: \"kubernetes.io/projected/0c810060-a84b-47cd-95cc-583e8ef6a535-kube-api-access-4nt2b\") pod \"nova-api-0\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " pod="openstack/nova-api-0"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.829187 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c810060-a84b-47cd-95cc-583e8ef6a535-config-data\") pod \"nova-api-0\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " pod="openstack/nova-api-0"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.829229 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c810060-a84b-47cd-95cc-583e8ef6a535-logs\") pod \"nova-api-0\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " pod="openstack/nova-api-0"
Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.829560 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c810060-a84b-47cd-95cc-583e8ef6a535-logs\") pod \"nova-api-0\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " pod="openstack/nova-api-0"
pod="openstack/nova-api-0" Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.832971 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c810060-a84b-47cd-95cc-583e8ef6a535-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " pod="openstack/nova-api-0" Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.841022 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c810060-a84b-47cd-95cc-583e8ef6a535-config-data\") pod \"nova-api-0\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " pod="openstack/nova-api-0" Oct 03 08:11:22 crc kubenswrapper[4664]: I1003 08:11:22.854557 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nt2b\" (UniqueName: \"kubernetes.io/projected/0c810060-a84b-47cd-95cc-583e8ef6a535-kube-api-access-4nt2b\") pod \"nova-api-0\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " pod="openstack/nova-api-0" Oct 03 08:11:23 crc kubenswrapper[4664]: I1003 08:11:23.043400 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 08:11:23 crc kubenswrapper[4664]: I1003 08:11:23.524644 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 08:11:23 crc kubenswrapper[4664]: I1003 08:11:23.613316 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38363065-9fad-46b4-bb38-3cf143e66913","Type":"ContainerStarted","Data":"93c72f2411ece905b6d3e4d337e7f3080f5077c17f9cdaeb73ccbe4cbc85f288"} Oct 03 08:11:23 crc kubenswrapper[4664]: I1003 08:11:23.617385 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c810060-a84b-47cd-95cc-583e8ef6a535","Type":"ContainerStarted","Data":"b119aad6fbdb648e7397339d05b4eef66cddf014b20e600a6b77e66d71e2a832"} Oct 03 08:11:23 crc kubenswrapper[4664]: I1003 08:11:23.894431 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a" path="/var/lib/kubelet/pods/3b6e1e4a-7a03-4bb2-8cce-c74a2193d24a/volumes" Oct 03 08:11:24 crc kubenswrapper[4664]: I1003 08:11:24.630700 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38363065-9fad-46b4-bb38-3cf143e66913","Type":"ContainerStarted","Data":"81eb9c781eb0b1a7a527682700a3c0e5666eaf20e7f0d795ef6ce7998bca90da"} Oct 03 08:11:24 crc kubenswrapper[4664]: I1003 08:11:24.632552 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c810060-a84b-47cd-95cc-583e8ef6a535","Type":"ContainerStarted","Data":"1dd621fa344843a13ec96569b9f659dd131c066a6b422b1b54c37c498a27882d"} Oct 03 08:11:24 crc kubenswrapper[4664]: I1003 08:11:24.632574 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c810060-a84b-47cd-95cc-583e8ef6a535","Type":"ContainerStarted","Data":"330a2a7c95e89213be7693e703668762c278d503763d9b016850748e16d7a222"} Oct 03 08:11:24 crc kubenswrapper[4664]: I1003 08:11:24.656384 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6563624580000003 podStartE2EDuration="2.656362458s" podCreationTimestamp="2025-10-03 08:11:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 
08:11:24.654142003 +0000 UTC m=+1385.475332503" watchObservedRunningTime="2025-10-03 08:11:24.656362458 +0000 UTC m=+1385.477552968" Oct 03 08:11:24 crc kubenswrapper[4664]: I1003 08:11:24.954275 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 08:11:25 crc kubenswrapper[4664]: I1003 08:11:25.643773 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38363065-9fad-46b4-bb38-3cf143e66913","Type":"ContainerStarted","Data":"e637af41c173bf613d4f38203f5c8864fb7a71c5707ea3b918b5213cc202608c"} Oct 03 08:11:25 crc kubenswrapper[4664]: I1003 08:11:25.644244 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 08:11:25 crc kubenswrapper[4664]: I1003 08:11:25.669506 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.143264784 podStartE2EDuration="5.669487285s" podCreationTimestamp="2025-10-03 08:11:20 +0000 UTC" firstStartedPulling="2025-10-03 08:11:21.670506787 +0000 UTC m=+1382.491697277" lastFinishedPulling="2025-10-03 08:11:25.196729288 +0000 UTC m=+1386.017919778" observedRunningTime="2025-10-03 08:11:25.669353041 +0000 UTC m=+1386.490543541" watchObservedRunningTime="2025-10-03 08:11:25.669487285 +0000 UTC m=+1386.490677775" Oct 03 08:11:26 crc kubenswrapper[4664]: I1003 08:11:26.100191 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 08:11:26 crc kubenswrapper[4664]: I1003 08:11:26.911182 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 03 08:11:31 crc kubenswrapper[4664]: I1003 08:11:31.100927 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 08:11:31 crc kubenswrapper[4664]: I1003 08:11:31.127357 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 08:11:31 crc kubenswrapper[4664]: I1003 08:11:31.727438 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 08:11:33 crc kubenswrapper[4664]: I1003 08:11:33.044741 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 08:11:33 crc kubenswrapper[4664]: I1003 08:11:33.044798 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 08:11:34 crc kubenswrapper[4664]: I1003 08:11:34.085874 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0c810060-a84b-47cd-95cc-583e8ef6a535" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 08:11:34 crc kubenswrapper[4664]: I1003 08:11:34.126891 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0c810060-a84b-47cd-95cc-583e8ef6a535" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.218583 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g8q67"] Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.221407 4664 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g8q67" Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.228473 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g8q67"] Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.332967 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tdfm\" (UniqueName: \"kubernetes.io/projected/ba25e34e-5665-4cf0-aaf4-b23e705acd52-kube-api-access-4tdfm\") pod \"community-operators-g8q67\" (UID: \"ba25e34e-5665-4cf0-aaf4-b23e705acd52\") " pod="openshift-marketplace/community-operators-g8q67" Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.333074 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba25e34e-5665-4cf0-aaf4-b23e705acd52-utilities\") pod \"community-operators-g8q67\" (UID: \"ba25e34e-5665-4cf0-aaf4-b23e705acd52\") " pod="openshift-marketplace/community-operators-g8q67" Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.333110 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba25e34e-5665-4cf0-aaf4-b23e705acd52-catalog-content\") pod \"community-operators-g8q67\" (UID: \"ba25e34e-5665-4cf0-aaf4-b23e705acd52\") " pod="openshift-marketplace/community-operators-g8q67" Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.435431 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba25e34e-5665-4cf0-aaf4-b23e705acd52-utilities\") pod \"community-operators-g8q67\" (UID: \"ba25e34e-5665-4cf0-aaf4-b23e705acd52\") " pod="openshift-marketplace/community-operators-g8q67" Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.435501 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba25e34e-5665-4cf0-aaf4-b23e705acd52-catalog-content\") pod \"community-operators-g8q67\" (UID: \"ba25e34e-5665-4cf0-aaf4-b23e705acd52\") " pod="openshift-marketplace/community-operators-g8q67" Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.435665 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tdfm\" (UniqueName: \"kubernetes.io/projected/ba25e34e-5665-4cf0-aaf4-b23e705acd52-kube-api-access-4tdfm\") pod \"community-operators-g8q67\" (UID: \"ba25e34e-5665-4cf0-aaf4-b23e705acd52\") " pod="openshift-marketplace/community-operators-g8q67" Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.436242 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba25e34e-5665-4cf0-aaf4-b23e705acd52-utilities\") pod \"community-operators-g8q67\" (UID: \"ba25e34e-5665-4cf0-aaf4-b23e705acd52\") " pod="openshift-marketplace/community-operators-g8q67" Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.436263 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba25e34e-5665-4cf0-aaf4-b23e705acd52-catalog-content\") pod \"community-operators-g8q67\" (UID: \"ba25e34e-5665-4cf0-aaf4-b23e705acd52\") " pod="openshift-marketplace/community-operators-g8q67" Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.460099 4664 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tdfm\" (UniqueName: \"kubernetes.io/projected/ba25e34e-5665-4cf0-aaf4-b23e705acd52-kube-api-access-4tdfm\") pod \"community-operators-g8q67\" (UID: \"ba25e34e-5665-4cf0-aaf4-b23e705acd52\") " pod="openshift-marketplace/community-operators-g8q67" Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.556724 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g8q67" Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.702254 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.796556 4664 generic.go:334] "Generic (PLEG): container finished" podID="8ca8d00b-c895-4c55-afd7-add414c51d9e" containerID="e66fb412acce8750bbcf82b71f1085888925806eb83b282705799a0d209f6c8c" exitCode=137 Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.796708 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ca8d00b-c895-4c55-afd7-add414c51d9e","Type":"ContainerDied","Data":"e66fb412acce8750bbcf82b71f1085888925806eb83b282705799a0d209f6c8c"} Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.799406 4664 generic.go:334] "Generic (PLEG): container finished" podID="0cd69772-a7a7-4126-9302-e7aae399ee11" containerID="52634ef9946d611a3caff50a56e86b1ac0e9a595f7a33923319fa611ae831287" exitCode=137 Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.799457 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0cd69772-a7a7-4126-9302-e7aae399ee11","Type":"ContainerDied","Data":"52634ef9946d611a3caff50a56e86b1ac0e9a595f7a33923319fa611ae831287"} Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.799489 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0cd69772-a7a7-4126-9302-e7aae399ee11","Type":"ContainerDied","Data":"e72ba4aeff440834fff4aacd47ec0e91f4be50f2334658e944084a239231443b"} Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.799508 4664 scope.go:117] "RemoveContainer" containerID="52634ef9946d611a3caff50a56e86b1ac0e9a595f7a33923319fa611ae831287" Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.799782 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.839903 4664 util.go:48] "No ready sandbox for pod can be found. 
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.851575 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t74pm\" (UniqueName: \"kubernetes.io/projected/0cd69772-a7a7-4126-9302-e7aae399ee11-kube-api-access-t74pm\") pod \"0cd69772-a7a7-4126-9302-e7aae399ee11\" (UID: \"0cd69772-a7a7-4126-9302-e7aae399ee11\") "
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.851669 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cd69772-a7a7-4126-9302-e7aae399ee11-config-data\") pod \"0cd69772-a7a7-4126-9302-e7aae399ee11\" (UID: \"0cd69772-a7a7-4126-9302-e7aae399ee11\") "
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.851722 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd69772-a7a7-4126-9302-e7aae399ee11-combined-ca-bundle\") pod \"0cd69772-a7a7-4126-9302-e7aae399ee11\" (UID: \"0cd69772-a7a7-4126-9302-e7aae399ee11\") "
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.871988 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cd69772-a7a7-4126-9302-e7aae399ee11-kube-api-access-t74pm" (OuterVolumeSpecName: "kube-api-access-t74pm") pod "0cd69772-a7a7-4126-9302-e7aae399ee11" (UID: "0cd69772-a7a7-4126-9302-e7aae399ee11"). InnerVolumeSpecName "kube-api-access-t74pm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.886809 4664 scope.go:117] "RemoveContainer" containerID="52634ef9946d611a3caff50a56e86b1ac0e9a595f7a33923319fa611ae831287"
Oct 03 08:11:38 crc kubenswrapper[4664]: E1003 08:11:38.896285 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52634ef9946d611a3caff50a56e86b1ac0e9a595f7a33923319fa611ae831287\": container with ID starting with 52634ef9946d611a3caff50a56e86b1ac0e9a595f7a33923319fa611ae831287 not found: ID does not exist" containerID="52634ef9946d611a3caff50a56e86b1ac0e9a595f7a33923319fa611ae831287"
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.896572 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52634ef9946d611a3caff50a56e86b1ac0e9a595f7a33923319fa611ae831287"} err="failed to get container status \"52634ef9946d611a3caff50a56e86b1ac0e9a595f7a33923319fa611ae831287\": rpc error: code = NotFound desc = could not find container \"52634ef9946d611a3caff50a56e86b1ac0e9a595f7a33923319fa611ae831287\": container with ID starting with 52634ef9946d611a3caff50a56e86b1ac0e9a595f7a33923319fa611ae831287 not found: ID does not exist"
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.916300 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd69772-a7a7-4126-9302-e7aae399ee11-config-data" (OuterVolumeSpecName: "config-data") pod "0cd69772-a7a7-4126-9302-e7aae399ee11" (UID: "0cd69772-a7a7-4126-9302-e7aae399ee11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.918679 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd69772-a7a7-4126-9302-e7aae399ee11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cd69772-a7a7-4126-9302-e7aae399ee11" (UID: "0cd69772-a7a7-4126-9302-e7aae399ee11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.953235 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chx6s\" (UniqueName: \"kubernetes.io/projected/8ca8d00b-c895-4c55-afd7-add414c51d9e-kube-api-access-chx6s\") pod \"8ca8d00b-c895-4c55-afd7-add414c51d9e\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") "
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.953414 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca8d00b-c895-4c55-afd7-add414c51d9e-config-data\") pod \"8ca8d00b-c895-4c55-afd7-add414c51d9e\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") "
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.953583 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca8d00b-c895-4c55-afd7-add414c51d9e-combined-ca-bundle\") pod \"8ca8d00b-c895-4c55-afd7-add414c51d9e\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") "
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.953628 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca8d00b-c895-4c55-afd7-add414c51d9e-logs\") pod \"8ca8d00b-c895-4c55-afd7-add414c51d9e\" (UID: \"8ca8d00b-c895-4c55-afd7-add414c51d9e\") "
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.954195 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t74pm\" (UniqueName: \"kubernetes.io/projected/0cd69772-a7a7-4126-9302-e7aae399ee11-kube-api-access-t74pm\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.954217 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cd69772-a7a7-4126-9302-e7aae399ee11-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.954228 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd69772-a7a7-4126-9302-e7aae399ee11-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.955049 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ca8d00b-c895-4c55-afd7-add414c51d9e-logs" (OuterVolumeSpecName: "logs") pod "8ca8d00b-c895-4c55-afd7-add414c51d9e" (UID: "8ca8d00b-c895-4c55-afd7-add414c51d9e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.958455 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ca8d00b-c895-4c55-afd7-add414c51d9e-kube-api-access-chx6s" (OuterVolumeSpecName: "kube-api-access-chx6s") pod "8ca8d00b-c895-4c55-afd7-add414c51d9e" (UID: "8ca8d00b-c895-4c55-afd7-add414c51d9e"). InnerVolumeSpecName "kube-api-access-chx6s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.978549 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ca8d00b-c895-4c55-afd7-add414c51d9e-config-data" (OuterVolumeSpecName: "config-data") pod "8ca8d00b-c895-4c55-afd7-add414c51d9e" (UID: "8ca8d00b-c895-4c55-afd7-add414c51d9e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:38 crc kubenswrapper[4664]: I1003 08:11:38.981505 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ca8d00b-c895-4c55-afd7-add414c51d9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ca8d00b-c895-4c55-afd7-add414c51d9e" (UID: "8ca8d00b-c895-4c55-afd7-add414c51d9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.055661 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca8d00b-c895-4c55-afd7-add414c51d9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.055701 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca8d00b-c895-4c55-afd7-add414c51d9e-logs\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.055715 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chx6s\" (UniqueName: \"kubernetes.io/projected/8ca8d00b-c895-4c55-afd7-add414c51d9e-kube-api-access-chx6s\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.055728 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca8d00b-c895-4c55-afd7-add414c51d9e-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.171827 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.191674 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.215156 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 03 08:11:39 crc kubenswrapper[4664]: E1003 08:11:39.215681 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca8d00b-c895-4c55-afd7-add414c51d9e" containerName="nova-metadata-metadata"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.215725 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca8d00b-c895-4c55-afd7-add414c51d9e" containerName="nova-metadata-metadata"
Oct 03 08:11:39 crc kubenswrapper[4664]: E1003 08:11:39.215763 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd69772-a7a7-4126-9302-e7aae399ee11" containerName="nova-cell1-novncproxy-novncproxy"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.215773 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd69772-a7a7-4126-9302-e7aae399ee11" containerName="nova-cell1-novncproxy-novncproxy"
Oct 03 08:11:39 crc kubenswrapper[4664]: E1003 08:11:39.215808 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca8d00b-c895-4c55-afd7-add414c51d9e" containerName="nova-metadata-log"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.215817 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca8d00b-c895-4c55-afd7-add414c51d9e" containerName="nova-metadata-log"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.216054 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ca8d00b-c895-4c55-afd7-add414c51d9e" containerName="nova-metadata-log"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.216095 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cd69772-a7a7-4126-9302-e7aae399ee11" containerName="nova-cell1-novncproxy-novncproxy"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.216114 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ca8d00b-c895-4c55-afd7-add414c51d9e" containerName="nova-metadata-metadata"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.216959 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.225890 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g8q67"]
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.227533 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.227641 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.227915 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.237366 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.361905 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc938212-d4cd-4ab5-bc5c-126715d0e3d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc938212-d4cd-4ab5-bc5c-126715d0e3d4\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.362215 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jncfz\" (UniqueName: \"kubernetes.io/projected/dc938212-d4cd-4ab5-bc5c-126715d0e3d4-kube-api-access-jncfz\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc938212-d4cd-4ab5-bc5c-126715d0e3d4\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.362352 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc938212-d4cd-4ab5-bc5c-126715d0e3d4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc938212-d4cd-4ab5-bc5c-126715d0e3d4\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.362395 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc938212-d4cd-4ab5-bc5c-126715d0e3d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc938212-d4cd-4ab5-bc5c-126715d0e3d4\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.362676 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc938212-d4cd-4ab5-bc5c-126715d0e3d4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc938212-d4cd-4ab5-bc5c-126715d0e3d4\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.465060 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc938212-d4cd-4ab5-bc5c-126715d0e3d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc938212-d4cd-4ab5-bc5c-126715d0e3d4\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.465101 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jncfz\" (UniqueName: \"kubernetes.io/projected/dc938212-d4cd-4ab5-bc5c-126715d0e3d4-kube-api-access-jncfz\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc938212-d4cd-4ab5-bc5c-126715d0e3d4\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.465216 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc938212-d4cd-4ab5-bc5c-126715d0e3d4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc938212-d4cd-4ab5-bc5c-126715d0e3d4\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.465250 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc938212-d4cd-4ab5-bc5c-126715d0e3d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc938212-d4cd-4ab5-bc5c-126715d0e3d4\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.465920 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc938212-d4cd-4ab5-bc5c-126715d0e3d4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc938212-d4cd-4ab5-bc5c-126715d0e3d4\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.469352 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc938212-d4cd-4ab5-bc5c-126715d0e3d4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc938212-d4cd-4ab5-bc5c-126715d0e3d4\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.470312 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc938212-d4cd-4ab5-bc5c-126715d0e3d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc938212-d4cd-4ab5-bc5c-126715d0e3d4\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.471829 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc938212-d4cd-4ab5-bc5c-126715d0e3d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc938212-d4cd-4ab5-bc5c-126715d0e3d4\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.472028 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc938212-d4cd-4ab5-bc5c-126715d0e3d4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc938212-d4cd-4ab5-bc5c-126715d0e3d4\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.484723 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jncfz\" (UniqueName: \"kubernetes.io/projected/dc938212-d4cd-4ab5-bc5c-126715d0e3d4-kube-api-access-jncfz\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc938212-d4cd-4ab5-bc5c-126715d0e3d4\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.548236 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.810318 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ca8d00b-c895-4c55-afd7-add414c51d9e","Type":"ContainerDied","Data":"6f85f1e8ec87d01d19181d6d6310bfbfbed288e85fdd0ab5ed0858fcf01d74c1"}
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.810390 4664 scope.go:117] "RemoveContainer" containerID="e66fb412acce8750bbcf82b71f1085888925806eb83b282705799a0d209f6c8c"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.810513 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.816249 4664 generic.go:334] "Generic (PLEG): container finished" podID="ba25e34e-5665-4cf0-aaf4-b23e705acd52" containerID="7e80d0bddc347f88a7b4c46a0739f4eeed7187235f59d865236b02a13e569015" exitCode=0
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.816291 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8q67" event={"ID":"ba25e34e-5665-4cf0-aaf4-b23e705acd52","Type":"ContainerDied","Data":"7e80d0bddc347f88a7b4c46a0739f4eeed7187235f59d865236b02a13e569015"}
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.816317 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8q67" event={"ID":"ba25e34e-5665-4cf0-aaf4-b23e705acd52","Type":"ContainerStarted","Data":"ede8540e9a2bf64206a8d0850abd148129a0aa953df9dd54130035648b0bd6bc"}
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.837437 4664 scope.go:117] "RemoveContainer" containerID="2bd29c0215789aed0e5598e84d66401fcb43b598c235b1eef5cc50064ad78561"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.864661 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.890972 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cd69772-a7a7-4126-9302-e7aae399ee11" path="/var/lib/kubelet/pods/0cd69772-a7a7-4126-9302-e7aae399ee11/volumes"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.891523 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.895651 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.897154 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.899800 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.899976 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.907998 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 08:11:39 crc kubenswrapper[4664]: I1003 08:11:39.965797 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 03 08:11:39 crc kubenswrapper[4664]: W1003 08:11:39.969712 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc938212_d4cd_4ab5_bc5c_126715d0e3d4.slice/crio-07f90386a4ed8130091dfb2f85d9e16bfeffce8911ae9b745c557f28ac48af71 WatchSource:0}: Error finding container 07f90386a4ed8130091dfb2f85d9e16bfeffce8911ae9b745c557f28ac48af71: Status 404 returned error can't find the container with id 07f90386a4ed8130091dfb2f85d9e16bfeffce8911ae9b745c557f28ac48af71
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.077688 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-config-data\") pod \"nova-metadata-0\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") " pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.077749 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") " pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.077854 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") " pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.077896 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c157f08-279b-48dd-85a3-14dd87b27864-logs\") pod \"nova-metadata-0\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") " pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.077926 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtn9t\" (UniqueName: \"kubernetes.io/projected/0c157f08-279b-48dd-85a3-14dd87b27864-kube-api-access-wtn9t\") pod \"nova-metadata-0\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") " pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.180433 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") " pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.180548 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c157f08-279b-48dd-85a3-14dd87b27864-logs\") pod \"nova-metadata-0\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") " pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.180594 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtn9t\" (UniqueName: \"kubernetes.io/projected/0c157f08-279b-48dd-85a3-14dd87b27864-kube-api-access-wtn9t\") pod \"nova-metadata-0\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") " pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.180673 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-config-data\") pod \"nova-metadata-0\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") " pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.180695 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") " pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.181120 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c157f08-279b-48dd-85a3-14dd87b27864-logs\") pod \"nova-metadata-0\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") " pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.185177 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-config-data\") pod \"nova-metadata-0\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") " pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.186039 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") " pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.187408 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") " pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.200445 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtn9t\" (UniqueName: \"kubernetes.io/projected/0c157f08-279b-48dd-85a3-14dd87b27864-kube-api-access-wtn9t\") pod \"nova-metadata-0\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") " pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.215689 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.516832 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.843148 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc938212-d4cd-4ab5-bc5c-126715d0e3d4","Type":"ContainerStarted","Data":"57b9ef706d956e86d39cd5501ec8478360871eebaa6673589c3a0985452e862f"}
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.843206 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc938212-d4cd-4ab5-bc5c-126715d0e3d4","Type":"ContainerStarted","Data":"07f90386a4ed8130091dfb2f85d9e16bfeffce8911ae9b745c557f28ac48af71"}
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.883857 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c157f08-279b-48dd-85a3-14dd87b27864","Type":"ContainerStarted","Data":"958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0"}
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.883919 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c157f08-279b-48dd-85a3-14dd87b27864","Type":"ContainerStarted","Data":"e012e106913ea77f1084dd61c14fad3c53e449083574b74f9909fd8fe99572e9"}
Oct 03 08:11:40 crc kubenswrapper[4664]: I1003 08:11:40.921792 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.921774031 podStartE2EDuration="1.921774031s" podCreationTimestamp="2025-10-03 08:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:11:40.879184327 +0000 UTC m=+1401.700374847" watchObservedRunningTime="2025-10-03 08:11:40.921774031 +0000 UTC m=+1401.742964521"
Oct 03 08:11:41 crc kubenswrapper[4664]: I1003 08:11:41.900409 4664 generic.go:334] "Generic (PLEG): container finished" podID="ba25e34e-5665-4cf0-aaf4-b23e705acd52" containerID="d0db8bea4b10f1cd8034e6325b94671fca1165cb4681ad7d040797c9f9402f22" exitCode=0
Oct 03 08:11:41 crc kubenswrapper[4664]: I1003 08:11:41.910792 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ca8d00b-c895-4c55-afd7-add414c51d9e" path="/var/lib/kubelet/pods/8ca8d00b-c895-4c55-afd7-add414c51d9e/volumes"
Oct 03 08:11:41 crc kubenswrapper[4664]: I1003 08:11:41.914883 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c157f08-279b-48dd-85a3-14dd87b27864","Type":"ContainerStarted","Data":"5b2a58751be16c596b359d4ce29d6e09592cec35efa1747f430670c24874ea23"}
Oct 03 08:11:41 crc kubenswrapper[4664]: I1003 08:11:41.914998 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8q67" event={"ID":"ba25e34e-5665-4cf0-aaf4-b23e705acd52","Type":"ContainerDied","Data":"d0db8bea4b10f1cd8034e6325b94671fca1165cb4681ad7d040797c9f9402f22"}
Oct 03 08:11:41 crc kubenswrapper[4664]: I1003 08:11:41.921351 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.921331993 podStartE2EDuration="2.921331993s" podCreationTimestamp="2025-10-03 08:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:11:41.916976036 +0000 UTC m=+1402.738166546" watchObservedRunningTime="2025-10-03 08:11:41.921331993 +0000 UTC m=+1402.742522483"
Oct 03 08:11:42 crc kubenswrapper[4664]: I1003 08:11:42.912979 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8q67" event={"ID":"ba25e34e-5665-4cf0-aaf4-b23e705acd52","Type":"ContainerStarted","Data":"c32acf42b0a0e4bc1796355961b1ed1400990a743fabe124beb3950599f9e840"}
Oct 03 08:11:42 crc kubenswrapper[4664]: I1003 08:11:42.931475 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g8q67" podStartSLOduration=2.184984902 podStartE2EDuration="4.931446252s" podCreationTimestamp="2025-10-03 08:11:38 +0000 UTC" firstStartedPulling="2025-10-03 08:11:39.818296024 +0000 UTC m=+1400.639486514" lastFinishedPulling="2025-10-03 08:11:42.564757384 +0000 UTC m=+1403.385947864" observedRunningTime="2025-10-03 08:11:42.930131924 +0000 UTC m=+1403.751322424" watchObservedRunningTime="2025-10-03 08:11:42.931446252 +0000 UTC m=+1403.752636742"
Oct 03 08:11:43 crc kubenswrapper[4664]: I1003 08:11:43.049024 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 03 08:11:43 crc kubenswrapper[4664]: I1003 08:11:43.049589 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 03 08:11:43 crc kubenswrapper[4664]: I1003 08:11:43.054135 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 03 08:11:43 crc kubenswrapper[4664]: I1003 08:11:43.054293 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 03 08:11:43 crc kubenswrapper[4664]: I1003 08:11:43.922489 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 03 08:11:43 crc kubenswrapper[4664]: I1003 08:11:43.927084 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.136681 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-czxf9"]
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.138298 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.154726 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-czxf9"]
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.268256 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.268342 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-config\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.268372 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.268422 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.268511 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.268837 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp92w\" (UniqueName: \"kubernetes.io/projected/90873c04-0d3c-41db-808b-8d550af4fe50-kube-api-access-lp92w\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.370453 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.370629 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp92w\" (UniqueName: \"kubernetes.io/projected/90873c04-0d3c-41db-808b-8d550af4fe50-kube-api-access-lp92w\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.370875 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.370943 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-config\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.370974 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.371027 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.371990 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.372156 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.372200 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.372230 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-config\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.372239 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.398574 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp92w\" (UniqueName: \"kubernetes.io/projected/90873c04-0d3c-41db-808b-8d550af4fe50-kube-api-access-lp92w\") pod \"dnsmasq-dns-5c7b6c5df9-czxf9\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.459509 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:44 crc kubenswrapper[4664]: I1003 08:11:44.548475 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:45 crc kubenswrapper[4664]: I1003 08:11:45.016240 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-czxf9"]
Oct 03 08:11:45 crc kubenswrapper[4664]: I1003 08:11:45.217595 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 03 08:11:45 crc kubenswrapper[4664]: I1003 08:11:45.217923 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 03 08:11:45 crc kubenswrapper[4664]: I1003 08:11:45.941217 4664 generic.go:334] "Generic (PLEG): container finished" podID="90873c04-0d3c-41db-808b-8d550af4fe50" containerID="b620d494e00cebd9a19844540362b70ef19dd7906c828ea37f2362685b8dff76" exitCode=0
Oct 03 08:11:45 crc kubenswrapper[4664]: I1003 08:11:45.941319 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9" event={"ID":"90873c04-0d3c-41db-808b-8d550af4fe50","Type":"ContainerDied","Data":"b620d494e00cebd9a19844540362b70ef19dd7906c828ea37f2362685b8dff76"}
Oct 03 08:11:45 crc kubenswrapper[4664]: I1003 08:11:45.941533 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9" event={"ID":"90873c04-0d3c-41db-808b-8d550af4fe50","Type":"ContainerStarted","Data":"0963a1e7e9dfccb8763fa46fa81764860893e9979f065b1d9534efe0a0392c64"}
Oct 03 08:11:46 crc kubenswrapper[4664]: I1003 08:11:46.509078 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 08:11:46 crc kubenswrapper[4664]: I1003 08:11:46.509796 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="ceilometer-central-agent" containerID="cri-o://41fe124b4625424ca380f7fdcc0cc750af67596a8f412a25a1ba046926232d6f" gracePeriod=30
Oct 03 08:11:46 crc kubenswrapper[4664]: I1003 08:11:46.509861 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="ceilometer-notification-agent" containerID="cri-o://93c72f2411ece905b6d3e4d337e7f3080f5077c17f9cdaeb73ccbe4cbc85f288" gracePeriod=30
Oct 03 08:11:46 crc kubenswrapper[4664]: I1003 08:11:46.509881 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="sg-core" containerID="cri-o://81eb9c781eb0b1a7a527682700a3c0e5666eaf20e7f0d795ef6ce7998bca90da" gracePeriod=30
Oct 03 08:11:46 crc kubenswrapper[4664]: I1003 08:11:46.509883 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="proxy-httpd" containerID="cri-o://e637af41c173bf613d4f38203f5c8864fb7a71c5707ea3b918b5213cc202608c" gracePeriod=30
Oct 03 08:11:46 crc kubenswrapper[4664]: I1003 08:11:46.524921 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.191:3000/\": EOF"
Oct 03 08:11:46 crc kubenswrapper[4664]: I1003 08:11:46.955349 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9" event={"ID":"90873c04-0d3c-41db-808b-8d550af4fe50","Type":"ContainerStarted","Data":"e52f7c9a816336a03a3996b335ffe428595913b0431ab0ab580de426c6a8e1a3"}
Oct 03 08:11:46 crc kubenswrapper[4664]: I1003 08:11:46.955930 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9"
Oct 03 08:11:46 crc kubenswrapper[4664]: I1003 08:11:46.959592 4664 generic.go:334] "Generic (PLEG): container finished" podID="38363065-9fad-46b4-bb38-3cf143e66913" containerID="e637af41c173bf613d4f38203f5c8864fb7a71c5707ea3b918b5213cc202608c" exitCode=0
Oct 03 08:11:46 crc kubenswrapper[4664]: I1003 08:11:46.959651 4664 generic.go:334] "Generic (PLEG): container finished" podID="38363065-9fad-46b4-bb38-3cf143e66913" containerID="81eb9c781eb0b1a7a527682700a3c0e5666eaf20e7f0d795ef6ce7998bca90da" exitCode=2
Oct 03 08:11:46 crc kubenswrapper[4664]: I1003 08:11:46.959682 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38363065-9fad-46b4-bb38-3cf143e66913","Type":"ContainerDied","Data":"e637af41c173bf613d4f38203f5c8864fb7a71c5707ea3b918b5213cc202608c"}
Oct 03 08:11:46 crc kubenswrapper[4664]: I1003 08:11:46.960220 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38363065-9fad-46b4-bb38-3cf143e66913","Type":"ContainerDied","Data":"81eb9c781eb0b1a7a527682700a3c0e5666eaf20e7f0d795ef6ce7998bca90da"}
Oct 03 08:11:46 crc kubenswrapper[4664]: I1003 08:11:46.982198 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9" podStartSLOduration=2.982182462 podStartE2EDuration="2.982182462s" podCreationTimestamp="2025-10-03 08:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:11:46.981910394 +0000 UTC m=+1407.803100884" watchObservedRunningTime="2025-10-03 08:11:46.982182462 +0000 UTC m=+1407.803372952"
Oct 03 08:11:47 crc kubenswrapper[4664]: I1003 08:11:47.142913 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 03 08:11:47 crc kubenswrapper[4664]: I1003 08:11:47.143148 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0c810060-a84b-47cd-95cc-583e8ef6a535" containerName="nova-api-log" containerID="cri-o://330a2a7c95e89213be7693e703668762c278d503763d9b016850748e16d7a222" gracePeriod=30
Oct 03 08:11:47 crc kubenswrapper[4664]: I1003 08:11:47.143244 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0c810060-a84b-47cd-95cc-583e8ef6a535" containerName="nova-api-api" containerID="cri-o://1dd621fa344843a13ec96569b9f659dd131c066a6b422b1b54c37c498a27882d" gracePeriod=30
Oct 03 08:11:47 crc kubenswrapper[4664]: I1003 08:11:47.972801 4664 generic.go:334] "Generic (PLEG): container finished" podID="38363065-9fad-46b4-bb38-3cf143e66913" containerID="41fe124b4625424ca380f7fdcc0cc750af67596a8f412a25a1ba046926232d6f" exitCode=0
Oct 03 08:11:47 crc kubenswrapper[4664]: I1003 08:11:47.972888 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38363065-9fad-46b4-bb38-3cf143e66913","Type":"ContainerDied","Data":"41fe124b4625424ca380f7fdcc0cc750af67596a8f412a25a1ba046926232d6f"}
Oct 03 08:11:47 crc kubenswrapper[4664]: I1003 08:11:47.977959 4664 generic.go:334] "Generic (PLEG): container finished" podID="0c810060-a84b-47cd-95cc-583e8ef6a535" containerID="330a2a7c95e89213be7693e703668762c278d503763d9b016850748e16d7a222" exitCode=143
Oct 03 08:11:47 crc kubenswrapper[4664]: I1003 08:11:47.978036 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c810060-a84b-47cd-95cc-583e8ef6a535","Type":"ContainerDied","Data":"330a2a7c95e89213be7693e703668762c278d503763d9b016850748e16d7a222"}
Oct 03 08:11:48 crc kubenswrapper[4664]: I1003 08:11:48.558827 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g8q67"
Oct 03 08:11:48 crc kubenswrapper[4664]: I1003 08:11:48.558895 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g8q67"
Oct 03 08:11:48 crc kubenswrapper[4664]: I1003 08:11:48.609417 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g8q67"
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.044443 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g8q67"
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.100378 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g8q67"]
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.549241 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.570632 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.818196 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.887409 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-ceilometer-tls-certs\") pod \"38363065-9fad-46b4-bb38-3cf143e66913\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") "
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.887461 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38363065-9fad-46b4-bb38-3cf143e66913-log-httpd\") pod \"38363065-9fad-46b4-bb38-3cf143e66913\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") "
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.887482 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpfjm\" (UniqueName: \"kubernetes.io/projected/38363065-9fad-46b4-bb38-3cf143e66913-kube-api-access-jpfjm\") pod \"38363065-9fad-46b4-bb38-3cf143e66913\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") "
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.887535 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-combined-ca-bundle\") pod \"38363065-9fad-46b4-bb38-3cf143e66913\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") "
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.887564 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38363065-9fad-46b4-bb38-3cf143e66913-run-httpd\") pod \"38363065-9fad-46b4-bb38-3cf143e66913\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") "
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.887737 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-config-data\") pod \"38363065-9fad-46b4-bb38-3cf143e66913\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") "
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.887839 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-sg-core-conf-yaml\") pod \"38363065-9fad-46b4-bb38-3cf143e66913\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") "
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.887906 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-scripts\") pod \"38363065-9fad-46b4-bb38-3cf143e66913\" (UID: \"38363065-9fad-46b4-bb38-3cf143e66913\") "
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.889378 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38363065-9fad-46b4-bb38-3cf143e66913-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "38363065-9fad-46b4-bb38-3cf143e66913" (UID: "38363065-9fad-46b4-bb38-3cf143e66913"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.889566 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38363065-9fad-46b4-bb38-3cf143e66913-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "38363065-9fad-46b4-bb38-3cf143e66913" (UID: "38363065-9fad-46b4-bb38-3cf143e66913"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.896927 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-scripts" (OuterVolumeSpecName: "scripts") pod "38363065-9fad-46b4-bb38-3cf143e66913" (UID: "38363065-9fad-46b4-bb38-3cf143e66913"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.910149 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38363065-9fad-46b4-bb38-3cf143e66913-kube-api-access-jpfjm" (OuterVolumeSpecName: "kube-api-access-jpfjm") pod "38363065-9fad-46b4-bb38-3cf143e66913" (UID: "38363065-9fad-46b4-bb38-3cf143e66913"). InnerVolumeSpecName "kube-api-access-jpfjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.922182 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "38363065-9fad-46b4-bb38-3cf143e66913" (UID: "38363065-9fad-46b4-bb38-3cf143e66913"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.957248 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "38363065-9fad-46b4-bb38-3cf143e66913" (UID: "38363065-9fad-46b4-bb38-3cf143e66913"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.990980 4664 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.991007 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.991018 4664 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.991026 4664 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38363065-9fad-46b4-bb38-3cf143e66913-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.991034 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpfjm\" (UniqueName: \"kubernetes.io/projected/38363065-9fad-46b4-bb38-3cf143e66913-kube-api-access-jpfjm\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.991047 4664 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38363065-9fad-46b4-bb38-3cf143e66913-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.997744 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-config-data" (OuterVolumeSpecName: "config-data") pod "38363065-9fad-46b4-bb38-3cf143e66913" (UID: "38363065-9fad-46b4-bb38-3cf143e66913"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:49 crc kubenswrapper[4664]: I1003 08:11:49.998545 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38363065-9fad-46b4-bb38-3cf143e66913" (UID: "38363065-9fad-46b4-bb38-3cf143e66913"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.020440 4664 generic.go:334] "Generic (PLEG): container finished" podID="38363065-9fad-46b4-bb38-3cf143e66913" containerID="93c72f2411ece905b6d3e4d337e7f3080f5077c17f9cdaeb73ccbe4cbc85f288" exitCode=0
Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.020468 4664 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.020482 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38363065-9fad-46b4-bb38-3cf143e66913","Type":"ContainerDied","Data":"93c72f2411ece905b6d3e4d337e7f3080f5077c17f9cdaeb73ccbe4cbc85f288"} Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.020527 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38363065-9fad-46b4-bb38-3cf143e66913","Type":"ContainerDied","Data":"88d2ab128a09febeb19f4bf4fd319cb9bf5f565a1a08832cccc4fb7fa1bec159"} Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.020548 4664 scope.go:117] "RemoveContainer" containerID="e637af41c173bf613d4f38203f5c8864fb7a71c5707ea3b918b5213cc202608c" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.039581 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.045291 4664 scope.go:117] "RemoveContainer" containerID="81eb9c781eb0b1a7a527682700a3c0e5666eaf20e7f0d795ef6ce7998bca90da" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.065977 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.085881 4664 scope.go:117] "RemoveContainer" containerID="93c72f2411ece905b6d3e4d337e7f3080f5077c17f9cdaeb73ccbe4cbc85f288" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.093429 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.093468 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38363065-9fad-46b4-bb38-3cf143e66913-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.096215 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.109124 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:11:50 crc kubenswrapper[4664]: E1003 08:11:50.109619 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="ceilometer-notification-agent" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.109638 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="ceilometer-notification-agent" Oct 03 08:11:50 crc kubenswrapper[4664]: E1003 08:11:50.109657 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="proxy-httpd" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.109663 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="proxy-httpd" Oct 03 08:11:50 crc kubenswrapper[4664]: E1003 08:11:50.109683 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="ceilometer-central-agent" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.109690 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="38363065-9fad-46b4-bb38-3cf143e66913" 
containerName="ceilometer-central-agent" Oct 03 08:11:50 crc kubenswrapper[4664]: E1003 08:11:50.109701 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="sg-core" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.109706 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="sg-core" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.109898 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="ceilometer-notification-agent" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.109932 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="proxy-httpd" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.109944 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="ceilometer-central-agent" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.109954 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="38363065-9fad-46b4-bb38-3cf143e66913" containerName="sg-core" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.111741 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.122539 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.122868 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.123007 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.128672 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.132122 4664 scope.go:117] "RemoveContainer" containerID="41fe124b4625424ca380f7fdcc0cc750af67596a8f412a25a1ba046926232d6f" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.195565 4664 scope.go:117] "RemoveContainer" containerID="e637af41c173bf613d4f38203f5c8864fb7a71c5707ea3b918b5213cc202608c" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.196768 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d15d050-2edb-489f-aa55-439467f10bd8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.196825 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d15d050-2edb-489f-aa55-439467f10bd8-run-httpd\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.196871 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d15d050-2edb-489f-aa55-439467f10bd8-scripts\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc 
kubenswrapper[4664]: I1003 08:11:50.197004 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d15d050-2edb-489f-aa55-439467f10bd8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.197031 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d15d050-2edb-489f-aa55-439467f10bd8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: E1003 08:11:50.199109 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e637af41c173bf613d4f38203f5c8864fb7a71c5707ea3b918b5213cc202608c\": container with ID starting with e637af41c173bf613d4f38203f5c8864fb7a71c5707ea3b918b5213cc202608c not found: ID does not exist" containerID="e637af41c173bf613d4f38203f5c8864fb7a71c5707ea3b918b5213cc202608c" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.199154 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e637af41c173bf613d4f38203f5c8864fb7a71c5707ea3b918b5213cc202608c"} err="failed to get container status \"e637af41c173bf613d4f38203f5c8864fb7a71c5707ea3b918b5213cc202608c\": rpc error: code = NotFound desc = could not find container \"e637af41c173bf613d4f38203f5c8864fb7a71c5707ea3b918b5213cc202608c\": container with ID starting with e637af41c173bf613d4f38203f5c8864fb7a71c5707ea3b918b5213cc202608c not found: ID does not exist" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.199181 4664 scope.go:117] "RemoveContainer" containerID="81eb9c781eb0b1a7a527682700a3c0e5666eaf20e7f0d795ef6ce7998bca90da" Oct 03 08:11:50 crc kubenswrapper[4664]: E1003 08:11:50.199460 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81eb9c781eb0b1a7a527682700a3c0e5666eaf20e7f0d795ef6ce7998bca90da\": container with ID starting with 81eb9c781eb0b1a7a527682700a3c0e5666eaf20e7f0d795ef6ce7998bca90da not found: ID does not exist" containerID="81eb9c781eb0b1a7a527682700a3c0e5666eaf20e7f0d795ef6ce7998bca90da" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.199485 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81eb9c781eb0b1a7a527682700a3c0e5666eaf20e7f0d795ef6ce7998bca90da"} err="failed to get container status \"81eb9c781eb0b1a7a527682700a3c0e5666eaf20e7f0d795ef6ce7998bca90da\": rpc error: code = NotFound desc = could not find container \"81eb9c781eb0b1a7a527682700a3c0e5666eaf20e7f0d795ef6ce7998bca90da\": container with ID starting with 81eb9c781eb0b1a7a527682700a3c0e5666eaf20e7f0d795ef6ce7998bca90da not found: ID does not exist" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.199499 4664 scope.go:117] "RemoveContainer" containerID="93c72f2411ece905b6d3e4d337e7f3080f5077c17f9cdaeb73ccbe4cbc85f288" Oct 03 08:11:50 crc kubenswrapper[4664]: E1003 08:11:50.199695 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93c72f2411ece905b6d3e4d337e7f3080f5077c17f9cdaeb73ccbe4cbc85f288\": container with ID starting with 
93c72f2411ece905b6d3e4d337e7f3080f5077c17f9cdaeb73ccbe4cbc85f288 not found: ID does not exist" containerID="93c72f2411ece905b6d3e4d337e7f3080f5077c17f9cdaeb73ccbe4cbc85f288" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.199712 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c72f2411ece905b6d3e4d337e7f3080f5077c17f9cdaeb73ccbe4cbc85f288"} err="failed to get container status \"93c72f2411ece905b6d3e4d337e7f3080f5077c17f9cdaeb73ccbe4cbc85f288\": rpc error: code = NotFound desc = could not find container \"93c72f2411ece905b6d3e4d337e7f3080f5077c17f9cdaeb73ccbe4cbc85f288\": container with ID starting with 93c72f2411ece905b6d3e4d337e7f3080f5077c17f9cdaeb73ccbe4cbc85f288 not found: ID does not exist" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.199725 4664 scope.go:117] "RemoveContainer" containerID="41fe124b4625424ca380f7fdcc0cc750af67596a8f412a25a1ba046926232d6f" Oct 03 08:11:50 crc kubenswrapper[4664]: E1003 08:11:50.199906 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41fe124b4625424ca380f7fdcc0cc750af67596a8f412a25a1ba046926232d6f\": container with ID starting with 41fe124b4625424ca380f7fdcc0cc750af67596a8f412a25a1ba046926232d6f not found: ID does not exist" containerID="41fe124b4625424ca380f7fdcc0cc750af67596a8f412a25a1ba046926232d6f" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.199922 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41fe124b4625424ca380f7fdcc0cc750af67596a8f412a25a1ba046926232d6f"} err="failed to get container status \"41fe124b4625424ca380f7fdcc0cc750af67596a8f412a25a1ba046926232d6f\": rpc error: code = NotFound desc = could not find container \"41fe124b4625424ca380f7fdcc0cc750af67596a8f412a25a1ba046926232d6f\": container with ID starting with 41fe124b4625424ca380f7fdcc0cc750af67596a8f412a25a1ba046926232d6f not found: ID does not exist" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.207015 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d15d050-2edb-489f-aa55-439467f10bd8-log-httpd\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.207159 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7gjf\" (UniqueName: \"kubernetes.io/projected/9d15d050-2edb-489f-aa55-439467f10bd8-kube-api-access-q7gjf\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.207259 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d15d050-2edb-489f-aa55-439467f10bd8-config-data\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.216327 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.217158 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.283367 4664 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-hvj8z"] Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.289967 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.292712 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.293542 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.298280 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hvj8z"] Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.313829 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d15d050-2edb-489f-aa55-439467f10bd8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.313882 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d15d050-2edb-489f-aa55-439467f10bd8-run-httpd\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.313913 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d15d050-2edb-489f-aa55-439467f10bd8-scripts\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.313977 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d15d050-2edb-489f-aa55-439467f10bd8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.313995 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d15d050-2edb-489f-aa55-439467f10bd8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.314039 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d15d050-2edb-489f-aa55-439467f10bd8-log-httpd\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.314072 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7gjf\" (UniqueName: \"kubernetes.io/projected/9d15d050-2edb-489f-aa55-439467f10bd8-kube-api-access-q7gjf\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.314123 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d15d050-2edb-489f-aa55-439467f10bd8-config-data\") pod \"ceilometer-0\" (UID: 
\"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.317002 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d15d050-2edb-489f-aa55-439467f10bd8-log-httpd\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.317804 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d15d050-2edb-489f-aa55-439467f10bd8-run-httpd\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.321803 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d15d050-2edb-489f-aa55-439467f10bd8-scripts\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.322159 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d15d050-2edb-489f-aa55-439467f10bd8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.322830 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d15d050-2edb-489f-aa55-439467f10bd8-config-data\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.324990 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d15d050-2edb-489f-aa55-439467f10bd8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.325000 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d15d050-2edb-489f-aa55-439467f10bd8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.333518 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7gjf\" (UniqueName: \"kubernetes.io/projected/9d15d050-2edb-489f-aa55-439467f10bd8-kube-api-access-q7gjf\") pod \"ceilometer-0\" (UID: \"9d15d050-2edb-489f-aa55-439467f10bd8\") " pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.415945 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-scripts\") pod \"nova-cell1-cell-mapping-hvj8z\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.416740 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfg7k\" (UniqueName: \"kubernetes.io/projected/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-kube-api-access-vfg7k\") pod 
\"nova-cell1-cell-mapping-hvj8z\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.416889 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hvj8z\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.417107 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-config-data\") pod \"nova-cell1-cell-mapping-hvj8z\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.438394 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.520114 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hvj8z\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.520155 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfg7k\" (UniqueName: \"kubernetes.io/projected/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-kube-api-access-vfg7k\") pod \"nova-cell1-cell-mapping-hvj8z\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.520231 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-config-data\") pod \"nova-cell1-cell-mapping-hvj8z\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.520285 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-scripts\") pod \"nova-cell1-cell-mapping-hvj8z\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.528279 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-scripts\") pod \"nova-cell1-cell-mapping-hvj8z\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.528388 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-config-data\") pod \"nova-cell1-cell-mapping-hvj8z\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.543448 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hvj8z\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.549624 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfg7k\" (UniqueName: \"kubernetes.io/projected/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-kube-api-access-vfg7k\") pod \"nova-cell1-cell-mapping-hvj8z\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.679229 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.781700 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.929324 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c810060-a84b-47cd-95cc-583e8ef6a535-config-data\") pod \"0c810060-a84b-47cd-95cc-583e8ef6a535\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.929447 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nt2b\" (UniqueName: \"kubernetes.io/projected/0c810060-a84b-47cd-95cc-583e8ef6a535-kube-api-access-4nt2b\") pod \"0c810060-a84b-47cd-95cc-583e8ef6a535\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.929532 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c810060-a84b-47cd-95cc-583e8ef6a535-logs\") pod \"0c810060-a84b-47cd-95cc-583e8ef6a535\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.929566 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c810060-a84b-47cd-95cc-583e8ef6a535-combined-ca-bundle\") pod \"0c810060-a84b-47cd-95cc-583e8ef6a535\" (UID: \"0c810060-a84b-47cd-95cc-583e8ef6a535\") " Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.930954 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c810060-a84b-47cd-95cc-583e8ef6a535-logs" (OuterVolumeSpecName: "logs") pod "0c810060-a84b-47cd-95cc-583e8ef6a535" (UID: "0c810060-a84b-47cd-95cc-583e8ef6a535"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.937200 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c810060-a84b-47cd-95cc-583e8ef6a535-kube-api-access-4nt2b" (OuterVolumeSpecName: "kube-api-access-4nt2b") pod "0c810060-a84b-47cd-95cc-583e8ef6a535" (UID: "0c810060-a84b-47cd-95cc-583e8ef6a535"). InnerVolumeSpecName "kube-api-access-4nt2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.971260 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c810060-a84b-47cd-95cc-583e8ef6a535-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c810060-a84b-47cd-95cc-583e8ef6a535" (UID: "0c810060-a84b-47cd-95cc-583e8ef6a535"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:11:50 crc kubenswrapper[4664]: I1003 08:11:50.980182 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c810060-a84b-47cd-95cc-583e8ef6a535-config-data" (OuterVolumeSpecName: "config-data") pod "0c810060-a84b-47cd-95cc-583e8ef6a535" (UID: "0c810060-a84b-47cd-95cc-583e8ef6a535"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.032318 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c810060-a84b-47cd-95cc-583e8ef6a535-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.032342 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nt2b\" (UniqueName: \"kubernetes.io/projected/0c810060-a84b-47cd-95cc-583e8ef6a535-kube-api-access-4nt2b\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.032353 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c810060-a84b-47cd-95cc-583e8ef6a535-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.032361 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c810060-a84b-47cd-95cc-583e8ef6a535-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.048415 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.053962 4664 generic.go:334] "Generic (PLEG): container finished" podID="0c810060-a84b-47cd-95cc-583e8ef6a535" containerID="1dd621fa344843a13ec96569b9f659dd131c066a6b422b1b54c37c498a27882d" exitCode=0 Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.054406 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g8q67" podUID="ba25e34e-5665-4cf0-aaf4-b23e705acd52" containerName="registry-server" containerID="cri-o://c32acf42b0a0e4bc1796355961b1ed1400990a743fabe124beb3950599f9e840" gracePeriod=2 Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.054863 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.057123 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c810060-a84b-47cd-95cc-583e8ef6a535","Type":"ContainerDied","Data":"1dd621fa344843a13ec96569b9f659dd131c066a6b422b1b54c37c498a27882d"} Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.057164 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c810060-a84b-47cd-95cc-583e8ef6a535","Type":"ContainerDied","Data":"b119aad6fbdb648e7397339d05b4eef66cddf014b20e600a6b77e66d71e2a832"} Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.057182 4664 scope.go:117] "RemoveContainer" containerID="1dd621fa344843a13ec96569b9f659dd131c066a6b422b1b54c37c498a27882d" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.115421 4664 scope.go:117] "RemoveContainer" containerID="330a2a7c95e89213be7693e703668762c278d503763d9b016850748e16d7a222" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.122970 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.162596 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.178087 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 08:11:51 crc kubenswrapper[4664]: E1003 08:11:51.178652 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c810060-a84b-47cd-95cc-583e8ef6a535" containerName="nova-api-api" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.178670 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c810060-a84b-47cd-95cc-583e8ef6a535" containerName="nova-api-api" Oct 03 08:11:51 crc kubenswrapper[4664]: E1003 08:11:51.178707 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c810060-a84b-47cd-95cc-583e8ef6a535" containerName="nova-api-log" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.178719 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c810060-a84b-47cd-95cc-583e8ef6a535" containerName="nova-api-log" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.179144 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c810060-a84b-47cd-95cc-583e8ef6a535" containerName="nova-api-log" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.179169 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c810060-a84b-47cd-95cc-583e8ef6a535" containerName="nova-api-api" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.197883 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.198083 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.204351 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.204527 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.215280 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.227859 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0c157f08-279b-48dd-85a3-14dd87b27864" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.228179 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0c157f08-279b-48dd-85a3-14dd87b27864" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.246880 4664 scope.go:117] "RemoveContainer" containerID="1dd621fa344843a13ec96569b9f659dd131c066a6b422b1b54c37c498a27882d" Oct 03 08:11:51 crc kubenswrapper[4664]: E1003 08:11:51.251505 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dd621fa344843a13ec96569b9f659dd131c066a6b422b1b54c37c498a27882d\": container with ID starting with 1dd621fa344843a13ec96569b9f659dd131c066a6b422b1b54c37c498a27882d not found: ID does not exist" containerID="1dd621fa344843a13ec96569b9f659dd131c066a6b422b1b54c37c498a27882d" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.251565 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd621fa344843a13ec96569b9f659dd131c066a6b422b1b54c37c498a27882d"} err="failed to get container status \"1dd621fa344843a13ec96569b9f659dd131c066a6b422b1b54c37c498a27882d\": rpc error: code = NotFound desc = could not find container \"1dd621fa344843a13ec96569b9f659dd131c066a6b422b1b54c37c498a27882d\": container with ID starting with 1dd621fa344843a13ec96569b9f659dd131c066a6b422b1b54c37c498a27882d not found: ID does not exist" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.251620 4664 scope.go:117] "RemoveContainer" containerID="330a2a7c95e89213be7693e703668762c278d503763d9b016850748e16d7a222" Oct 03 08:11:51 crc kubenswrapper[4664]: E1003 08:11:51.252144 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"330a2a7c95e89213be7693e703668762c278d503763d9b016850748e16d7a222\": container with ID starting with 330a2a7c95e89213be7693e703668762c278d503763d9b016850748e16d7a222 not found: ID does not exist" containerID="330a2a7c95e89213be7693e703668762c278d503763d9b016850748e16d7a222" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.252183 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"330a2a7c95e89213be7693e703668762c278d503763d9b016850748e16d7a222"} err="failed to get container status \"330a2a7c95e89213be7693e703668762c278d503763d9b016850748e16d7a222\": rpc error: code = NotFound desc = 
could not find container \"330a2a7c95e89213be7693e703668762c278d503763d9b016850748e16d7a222\": container with ID starting with 330a2a7c95e89213be7693e703668762c278d503763d9b016850748e16d7a222 not found: ID does not exist" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.329083 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hvj8z"] Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.349307 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t6lq\" (UniqueName: \"kubernetes.io/projected/1348c670-9f44-4f00-8044-0afb98bf7368-kube-api-access-7t6lq\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.349345 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1348c670-9f44-4f00-8044-0afb98bf7368-logs\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.349402 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.349489 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.349525 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-public-tls-certs\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.349567 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-config-data\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.452000 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.453149 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-public-tls-certs\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.454668 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-config-data\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.455961 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t6lq\" (UniqueName: \"kubernetes.io/projected/1348c670-9f44-4f00-8044-0afb98bf7368-kube-api-access-7t6lq\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.456011 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1348c670-9f44-4f00-8044-0afb98bf7368-logs\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.460730 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.460389 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-config-data\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.461234 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1348c670-9f44-4f00-8044-0afb98bf7368-logs\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.460266 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.464080 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-public-tls-certs\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.475466 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.488865 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t6lq\" (UniqueName: \"kubernetes.io/projected/1348c670-9f44-4f00-8044-0afb98bf7368-kube-api-access-7t6lq\") pod \"nova-api-0\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") " pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.543183 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.847983 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g8q67" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.902478 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tdfm\" (UniqueName: \"kubernetes.io/projected/ba25e34e-5665-4cf0-aaf4-b23e705acd52-kube-api-access-4tdfm\") pod \"ba25e34e-5665-4cf0-aaf4-b23e705acd52\" (UID: \"ba25e34e-5665-4cf0-aaf4-b23e705acd52\") " Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.902623 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba25e34e-5665-4cf0-aaf4-b23e705acd52-utilities\") pod \"ba25e34e-5665-4cf0-aaf4-b23e705acd52\" (UID: \"ba25e34e-5665-4cf0-aaf4-b23e705acd52\") " Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.902820 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba25e34e-5665-4cf0-aaf4-b23e705acd52-catalog-content\") pod \"ba25e34e-5665-4cf0-aaf4-b23e705acd52\" (UID: \"ba25e34e-5665-4cf0-aaf4-b23e705acd52\") " Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.903830 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c810060-a84b-47cd-95cc-583e8ef6a535" path="/var/lib/kubelet/pods/0c810060-a84b-47cd-95cc-583e8ef6a535/volumes" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.904632 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba25e34e-5665-4cf0-aaf4-b23e705acd52-utilities" (OuterVolumeSpecName: "utilities") pod "ba25e34e-5665-4cf0-aaf4-b23e705acd52" (UID: "ba25e34e-5665-4cf0-aaf4-b23e705acd52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.905597 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38363065-9fad-46b4-bb38-3cf143e66913" path="/var/lib/kubelet/pods/38363065-9fad-46b4-bb38-3cf143e66913/volumes" Oct 03 08:11:51 crc kubenswrapper[4664]: I1003 08:11:51.909435 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba25e34e-5665-4cf0-aaf4-b23e705acd52-kube-api-access-4tdfm" (OuterVolumeSpecName: "kube-api-access-4tdfm") pod "ba25e34e-5665-4cf0-aaf4-b23e705acd52" (UID: "ba25e34e-5665-4cf0-aaf4-b23e705acd52"). InnerVolumeSpecName "kube-api-access-4tdfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.007275 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tdfm\" (UniqueName: \"kubernetes.io/projected/ba25e34e-5665-4cf0-aaf4-b23e705acd52-kube-api-access-4tdfm\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.007780 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba25e34e-5665-4cf0-aaf4-b23e705acd52-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.042916 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba25e34e-5665-4cf0-aaf4-b23e705acd52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba25e34e-5665-4cf0-aaf4-b23e705acd52" (UID: "ba25e34e-5665-4cf0-aaf4-b23e705acd52"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.110022 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba25e34e-5665-4cf0-aaf4-b23e705acd52-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.120720 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g8q67" Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.120581 4664 generic.go:334] "Generic (PLEG): container finished" podID="ba25e34e-5665-4cf0-aaf4-b23e705acd52" containerID="c32acf42b0a0e4bc1796355961b1ed1400990a743fabe124beb3950599f9e840" exitCode=0 Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.123267 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8q67" event={"ID":"ba25e34e-5665-4cf0-aaf4-b23e705acd52","Type":"ContainerDied","Data":"c32acf42b0a0e4bc1796355961b1ed1400990a743fabe124beb3950599f9e840"} Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.123422 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8q67" event={"ID":"ba25e34e-5665-4cf0-aaf4-b23e705acd52","Type":"ContainerDied","Data":"ede8540e9a2bf64206a8d0850abd148129a0aa953df9dd54130035648b0bd6bc"} Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.123446 4664 scope.go:117] "RemoveContainer" containerID="c32acf42b0a0e4bc1796355961b1ed1400990a743fabe124beb3950599f9e840" Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.136690 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hvj8z" event={"ID":"149fac5c-dc4f-47b5-94f7-c36271ad6bc6","Type":"ContainerStarted","Data":"f618544ea84ac2201d9c8f6c7f6c36caf56a676ebd6fd4faf75985b6cc2c7a97"} Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.136743 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hvj8z" event={"ID":"149fac5c-dc4f-47b5-94f7-c36271ad6bc6","Type":"ContainerStarted","Data":"728ac9014d49d372ff20bd97dda732cfb3e3eeaa79cfd782e6da5f02e03c5c14"} Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.166369 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-hvj8z" podStartSLOduration=2.166347713 podStartE2EDuration="2.166347713s" podCreationTimestamp="2025-10-03 08:11:50 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:11:52.162056688 +0000 UTC m=+1412.983247188" watchObservedRunningTime="2025-10-03 08:11:52.166347713 +0000 UTC m=+1412.987538203" Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.167495 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d15d050-2edb-489f-aa55-439467f10bd8","Type":"ContainerStarted","Data":"b6c307d04a6924e36affa82afe4130e18d23269de4236980b2b80e83ab3f5d8a"} Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.167562 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d15d050-2edb-489f-aa55-439467f10bd8","Type":"ContainerStarted","Data":"96aa26dbf1a25d4096337c0b02b97c0f55e3b5a4d2aa6e5ed4341566ade5ca5e"} Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.184920 4664 scope.go:117] "RemoveContainer" containerID="d0db8bea4b10f1cd8034e6325b94671fca1165cb4681ad7d040797c9f9402f22" Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.197231 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g8q67"] Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.224066 4664 scope.go:117] "RemoveContainer" containerID="7e80d0bddc347f88a7b4c46a0739f4eeed7187235f59d865236b02a13e569015" Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.234671 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g8q67"] Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.271717 4664 scope.go:117] "RemoveContainer" containerID="c32acf42b0a0e4bc1796355961b1ed1400990a743fabe124beb3950599f9e840" Oct 03 08:11:52 crc kubenswrapper[4664]: E1003 08:11:52.274575 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c32acf42b0a0e4bc1796355961b1ed1400990a743fabe124beb3950599f9e840\": container with ID starting with c32acf42b0a0e4bc1796355961b1ed1400990a743fabe124beb3950599f9e840 not found: ID does not exist" containerID="c32acf42b0a0e4bc1796355961b1ed1400990a743fabe124beb3950599f9e840" Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.274652 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c32acf42b0a0e4bc1796355961b1ed1400990a743fabe124beb3950599f9e840"} err="failed to get container status \"c32acf42b0a0e4bc1796355961b1ed1400990a743fabe124beb3950599f9e840\": rpc error: code = NotFound desc = could not find container \"c32acf42b0a0e4bc1796355961b1ed1400990a743fabe124beb3950599f9e840\": container with ID starting with c32acf42b0a0e4bc1796355961b1ed1400990a743fabe124beb3950599f9e840 not found: ID does not exist" Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.274695 4664 scope.go:117] "RemoveContainer" containerID="d0db8bea4b10f1cd8034e6325b94671fca1165cb4681ad7d040797c9f9402f22" Oct 03 08:11:52 crc kubenswrapper[4664]: E1003 08:11:52.278915 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0db8bea4b10f1cd8034e6325b94671fca1165cb4681ad7d040797c9f9402f22\": container with ID starting with d0db8bea4b10f1cd8034e6325b94671fca1165cb4681ad7d040797c9f9402f22 not found: ID does not exist" containerID="d0db8bea4b10f1cd8034e6325b94671fca1165cb4681ad7d040797c9f9402f22" Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.278956 4664 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d0db8bea4b10f1cd8034e6325b94671fca1165cb4681ad7d040797c9f9402f22"} err="failed to get container status \"d0db8bea4b10f1cd8034e6325b94671fca1165cb4681ad7d040797c9f9402f22\": rpc error: code = NotFound desc = could not find container \"d0db8bea4b10f1cd8034e6325b94671fca1165cb4681ad7d040797c9f9402f22\": container with ID starting with d0db8bea4b10f1cd8034e6325b94671fca1165cb4681ad7d040797c9f9402f22 not found: ID does not exist" Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.278974 4664 scope.go:117] "RemoveContainer" containerID="7e80d0bddc347f88a7b4c46a0739f4eeed7187235f59d865236b02a13e569015" Oct 03 08:11:52 crc kubenswrapper[4664]: E1003 08:11:52.279924 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e80d0bddc347f88a7b4c46a0739f4eeed7187235f59d865236b02a13e569015\": container with ID starting with 7e80d0bddc347f88a7b4c46a0739f4eeed7187235f59d865236b02a13e569015 not found: ID does not exist" containerID="7e80d0bddc347f88a7b4c46a0739f4eeed7187235f59d865236b02a13e569015" Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.279961 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e80d0bddc347f88a7b4c46a0739f4eeed7187235f59d865236b02a13e569015"} err="failed to get container status \"7e80d0bddc347f88a7b4c46a0739f4eeed7187235f59d865236b02a13e569015\": rpc error: code = NotFound desc = could not find container \"7e80d0bddc347f88a7b4c46a0739f4eeed7187235f59d865236b02a13e569015\": container with ID starting with 7e80d0bddc347f88a7b4c46a0739f4eeed7187235f59d865236b02a13e569015 not found: ID does not exist" Oct 03 08:11:52 crc kubenswrapper[4664]: E1003 08:11:52.319982 4664 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba25e34e_5665_4cf0_aaf4_b23e705acd52.slice\": RecentStats: unable to find data in memory cache]" Oct 03 08:11:52 crc kubenswrapper[4664]: W1003 08:11:52.375445 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1348c670_9f44_4f00_8044_0afb98bf7368.slice/crio-46613ef0706b1537fdae60939f79b7dbb51b7255ecca6db15559908d48f7c3fa WatchSource:0}: Error finding container 46613ef0706b1537fdae60939f79b7dbb51b7255ecca6db15559908d48f7c3fa: Status 404 returned error can't find the container with id 46613ef0706b1537fdae60939f79b7dbb51b7255ecca6db15559908d48f7c3fa Oct 03 08:11:52 crc kubenswrapper[4664]: I1003 08:11:52.378564 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 08:11:53 crc kubenswrapper[4664]: I1003 08:11:53.178889 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1348c670-9f44-4f00-8044-0afb98bf7368","Type":"ContainerStarted","Data":"32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd"} Oct 03 08:11:53 crc kubenswrapper[4664]: I1003 08:11:53.179411 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1348c670-9f44-4f00-8044-0afb98bf7368","Type":"ContainerStarted","Data":"0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de"} Oct 03 08:11:53 crc kubenswrapper[4664]: I1003 08:11:53.179445 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"1348c670-9f44-4f00-8044-0afb98bf7368","Type":"ContainerStarted","Data":"46613ef0706b1537fdae60939f79b7dbb51b7255ecca6db15559908d48f7c3fa"} Oct 03 08:11:53 crc kubenswrapper[4664]: I1003 08:11:53.184965 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d15d050-2edb-489f-aa55-439467f10bd8","Type":"ContainerStarted","Data":"df20c8c35a3766917d878b051d310a610e5ecff99a81b9175ab5fed60f3680b2"} Oct 03 08:11:53 crc kubenswrapper[4664]: I1003 08:11:53.214908 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.214889636 podStartE2EDuration="2.214889636s" podCreationTimestamp="2025-10-03 08:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:11:53.210456376 +0000 UTC m=+1414.031646876" watchObservedRunningTime="2025-10-03 08:11:53.214889636 +0000 UTC m=+1414.036080116" Oct 03 08:11:53 crc kubenswrapper[4664]: I1003 08:11:53.886628 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba25e34e-5665-4cf0-aaf4-b23e705acd52" path="/var/lib/kubelet/pods/ba25e34e-5665-4cf0-aaf4-b23e705acd52/volumes" Oct 03 08:11:54 crc kubenswrapper[4664]: I1003 08:11:54.195754 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d15d050-2edb-489f-aa55-439467f10bd8","Type":"ContainerStarted","Data":"d3751f260b6faa0e58901ae286b7f6dfa23d5023f42bbdff013a43b7f40db919"} Oct 03 08:11:54 crc kubenswrapper[4664]: I1003 08:11:54.462819 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9" Oct 03 08:11:54 crc kubenswrapper[4664]: I1003 08:11:54.537120 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-rg57h"] Oct 03 08:11:54 crc kubenswrapper[4664]: I1003 08:11:54.537348 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-rg57h" podUID="2ea47450-c8b3-4b8f-85f0-53a44121a988" containerName="dnsmasq-dns" containerID="cri-o://c5862623667e86b9f2bdfef90ee1d72f4f75ee5592aecd00629913a74dbaff85" gracePeriod=10 Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.171715 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.184696 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhkgw\" (UniqueName: \"kubernetes.io/projected/2ea47450-c8b3-4b8f-85f0-53a44121a988-kube-api-access-bhkgw\") pod \"2ea47450-c8b3-4b8f-85f0-53a44121a988\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.184828 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-ovsdbserver-nb\") pod \"2ea47450-c8b3-4b8f-85f0-53a44121a988\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.184908 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-config\") pod \"2ea47450-c8b3-4b8f-85f0-53a44121a988\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.184948 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-dns-swift-storage-0\") pod \"2ea47450-c8b3-4b8f-85f0-53a44121a988\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.184982 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-ovsdbserver-sb\") pod \"2ea47450-c8b3-4b8f-85f0-53a44121a988\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.185012 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-dns-svc\") pod \"2ea47450-c8b3-4b8f-85f0-53a44121a988\" (UID: \"2ea47450-c8b3-4b8f-85f0-53a44121a988\") " Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.196064 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea47450-c8b3-4b8f-85f0-53a44121a988-kube-api-access-bhkgw" (OuterVolumeSpecName: "kube-api-access-bhkgw") pod "2ea47450-c8b3-4b8f-85f0-53a44121a988" (UID: "2ea47450-c8b3-4b8f-85f0-53a44121a988"). InnerVolumeSpecName "kube-api-access-bhkgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.212294 4664 generic.go:334] "Generic (PLEG): container finished" podID="2ea47450-c8b3-4b8f-85f0-53a44121a988" containerID="c5862623667e86b9f2bdfef90ee1d72f4f75ee5592aecd00629913a74dbaff85" exitCode=0 Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.212773 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-rg57h" event={"ID":"2ea47450-c8b3-4b8f-85f0-53a44121a988","Type":"ContainerDied","Data":"c5862623667e86b9f2bdfef90ee1d72f4f75ee5592aecd00629913a74dbaff85"} Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.212810 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-rg57h" event={"ID":"2ea47450-c8b3-4b8f-85f0-53a44121a988","Type":"ContainerDied","Data":"78d73dfe6db2b5b05f368f002c6033be6e2fffa2e003f5c5fe22633ebe1e2c5b"} Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.212854 4664 scope.go:117] "RemoveContainer" containerID="c5862623667e86b9f2bdfef90ee1d72f4f75ee5592aecd00629913a74dbaff85" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.213022 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-rg57h" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.223234 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d15d050-2edb-489f-aa55-439467f10bd8","Type":"ContainerStarted","Data":"2b6d35d7333be51c355bd10fb504904216e2e3fee98a1930552f019691406b87"} Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.224807 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.257842 4664 scope.go:117] "RemoveContainer" containerID="c3225e3260eea2df07e48298267610627d37cfda8da937b93e1b975ee59bb843" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.260226 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.931604128 podStartE2EDuration="5.260207918s" podCreationTimestamp="2025-10-03 08:11:50 +0000 UTC" firstStartedPulling="2025-10-03 08:11:51.073419925 +0000 UTC m=+1411.894610415" lastFinishedPulling="2025-10-03 08:11:54.402023715 +0000 UTC m=+1415.223214205" observedRunningTime="2025-10-03 08:11:55.257789767 +0000 UTC m=+1416.078980277" watchObservedRunningTime="2025-10-03 08:11:55.260207918 +0000 UTC m=+1416.081398398" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.271161 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ea47450-c8b3-4b8f-85f0-53a44121a988" (UID: "2ea47450-c8b3-4b8f-85f0-53a44121a988"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.273912 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ea47450-c8b3-4b8f-85f0-53a44121a988" (UID: "2ea47450-c8b3-4b8f-85f0-53a44121a988"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.279115 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-config" (OuterVolumeSpecName: "config") pod "2ea47450-c8b3-4b8f-85f0-53a44121a988" (UID: "2ea47450-c8b3-4b8f-85f0-53a44121a988"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.282253 4664 scope.go:117] "RemoveContainer" containerID="c5862623667e86b9f2bdfef90ee1d72f4f75ee5592aecd00629913a74dbaff85" Oct 03 08:11:55 crc kubenswrapper[4664]: E1003 08:11:55.283190 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5862623667e86b9f2bdfef90ee1d72f4f75ee5592aecd00629913a74dbaff85\": container with ID starting with c5862623667e86b9f2bdfef90ee1d72f4f75ee5592aecd00629913a74dbaff85 not found: ID does not exist" containerID="c5862623667e86b9f2bdfef90ee1d72f4f75ee5592aecd00629913a74dbaff85" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.283236 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5862623667e86b9f2bdfef90ee1d72f4f75ee5592aecd00629913a74dbaff85"} err="failed to get container status \"c5862623667e86b9f2bdfef90ee1d72f4f75ee5592aecd00629913a74dbaff85\": rpc error: code = NotFound desc = could not find container \"c5862623667e86b9f2bdfef90ee1d72f4f75ee5592aecd00629913a74dbaff85\": container with ID starting with c5862623667e86b9f2bdfef90ee1d72f4f75ee5592aecd00629913a74dbaff85 not found: ID does not exist" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.283264 4664 scope.go:117] "RemoveContainer" containerID="c3225e3260eea2df07e48298267610627d37cfda8da937b93e1b975ee59bb843" Oct 03 08:11:55 crc kubenswrapper[4664]: E1003 08:11:55.283669 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3225e3260eea2df07e48298267610627d37cfda8da937b93e1b975ee59bb843\": container with ID starting with c3225e3260eea2df07e48298267610627d37cfda8da937b93e1b975ee59bb843 not found: ID does not exist" containerID="c3225e3260eea2df07e48298267610627d37cfda8da937b93e1b975ee59bb843" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.283711 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3225e3260eea2df07e48298267610627d37cfda8da937b93e1b975ee59bb843"} err="failed to get container status \"c3225e3260eea2df07e48298267610627d37cfda8da937b93e1b975ee59bb843\": rpc error: code = NotFound desc = could not find container \"c3225e3260eea2df07e48298267610627d37cfda8da937b93e1b975ee59bb843\": container with ID starting with c3225e3260eea2df07e48298267610627d37cfda8da937b93e1b975ee59bb843 not found: ID does not exist" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.287698 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhkgw\" (UniqueName: \"kubernetes.io/projected/2ea47450-c8b3-4b8f-85f0-53a44121a988-kube-api-access-bhkgw\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.287725 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.287735 4664 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.287743 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.288991 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2ea47450-c8b3-4b8f-85f0-53a44121a988" (UID: "2ea47450-c8b3-4b8f-85f0-53a44121a988"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.295282 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ea47450-c8b3-4b8f-85f0-53a44121a988" (UID: "2ea47450-c8b3-4b8f-85f0-53a44121a988"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.390030 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.390066 4664 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ea47450-c8b3-4b8f-85f0-53a44121a988-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.581569 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-rg57h"] Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.590740 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-rg57h"] Oct 03 08:11:55 crc kubenswrapper[4664]: I1003 08:11:55.895857 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ea47450-c8b3-4b8f-85f0-53a44121a988" path="/var/lib/kubelet/pods/2ea47450-c8b3-4b8f-85f0-53a44121a988/volumes" Oct 03 08:11:58 crc kubenswrapper[4664]: I1003 08:11:58.254161 4664 generic.go:334] "Generic (PLEG): container finished" podID="149fac5c-dc4f-47b5-94f7-c36271ad6bc6" containerID="f618544ea84ac2201d9c8f6c7f6c36caf56a676ebd6fd4faf75985b6cc2c7a97" exitCode=0 Oct 03 08:11:58 crc kubenswrapper[4664]: I1003 08:11:58.254268 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hvj8z" event={"ID":"149fac5c-dc4f-47b5-94f7-c36271ad6bc6","Type":"ContainerDied","Data":"f618544ea84ac2201d9c8f6c7f6c36caf56a676ebd6fd4faf75985b6cc2c7a97"} Oct 03 08:11:59 crc kubenswrapper[4664]: I1003 08:11:59.609789 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:11:59 crc kubenswrapper[4664]: I1003 08:11:59.674394 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-config-data\") pod \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " Oct 03 08:11:59 crc kubenswrapper[4664]: I1003 08:11:59.674520 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-combined-ca-bundle\") pod \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " Oct 03 08:11:59 crc kubenswrapper[4664]: I1003 08:11:59.674587 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-scripts\") pod \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " Oct 03 08:11:59 crc kubenswrapper[4664]: I1003 08:11:59.674633 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfg7k\" (UniqueName: \"kubernetes.io/projected/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-kube-api-access-vfg7k\") pod \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\" (UID: \"149fac5c-dc4f-47b5-94f7-c36271ad6bc6\") " Oct 03 08:11:59 crc kubenswrapper[4664]: I1003 08:11:59.695712 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-kube-api-access-vfg7k" (OuterVolumeSpecName: "kube-api-access-vfg7k") pod "149fac5c-dc4f-47b5-94f7-c36271ad6bc6" (UID: "149fac5c-dc4f-47b5-94f7-c36271ad6bc6"). InnerVolumeSpecName "kube-api-access-vfg7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:11:59 crc kubenswrapper[4664]: I1003 08:11:59.696198 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-scripts" (OuterVolumeSpecName: "scripts") pod "149fac5c-dc4f-47b5-94f7-c36271ad6bc6" (UID: "149fac5c-dc4f-47b5-94f7-c36271ad6bc6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:11:59 crc kubenswrapper[4664]: I1003 08:11:59.708824 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "149fac5c-dc4f-47b5-94f7-c36271ad6bc6" (UID: "149fac5c-dc4f-47b5-94f7-c36271ad6bc6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:11:59 crc kubenswrapper[4664]: I1003 08:11:59.718125 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-config-data" (OuterVolumeSpecName: "config-data") pod "149fac5c-dc4f-47b5-94f7-c36271ad6bc6" (UID: "149fac5c-dc4f-47b5-94f7-c36271ad6bc6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:11:59 crc kubenswrapper[4664]: I1003 08:11:59.776690 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:59 crc kubenswrapper[4664]: I1003 08:11:59.776725 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:59 crc kubenswrapper[4664]: I1003 08:11:59.776737 4664 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:59 crc kubenswrapper[4664]: I1003 08:11:59.776745 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfg7k\" (UniqueName: \"kubernetes.io/projected/149fac5c-dc4f-47b5-94f7-c36271ad6bc6-kube-api-access-vfg7k\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:00 crc kubenswrapper[4664]: I1003 08:12:00.223582 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 08:12:00 crc kubenswrapper[4664]: I1003 08:12:00.226694 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 08:12:00 crc kubenswrapper[4664]: I1003 08:12:00.236317 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 08:12:00 crc kubenswrapper[4664]: I1003 08:12:00.276271 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hvj8z" Oct 03 08:12:00 crc kubenswrapper[4664]: I1003 08:12:00.276263 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hvj8z" event={"ID":"149fac5c-dc4f-47b5-94f7-c36271ad6bc6","Type":"ContainerDied","Data":"728ac9014d49d372ff20bd97dda732cfb3e3eeaa79cfd782e6da5f02e03c5c14"} Oct 03 08:12:00 crc kubenswrapper[4664]: I1003 08:12:00.276312 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="728ac9014d49d372ff20bd97dda732cfb3e3eeaa79cfd782e6da5f02e03c5c14" Oct 03 08:12:00 crc kubenswrapper[4664]: I1003 08:12:00.282623 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 08:12:00 crc kubenswrapper[4664]: I1003 08:12:00.474858 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 08:12:00 crc kubenswrapper[4664]: I1003 08:12:00.475104 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1348c670-9f44-4f00-8044-0afb98bf7368" containerName="nova-api-log" containerID="cri-o://0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de" gracePeriod=30 Oct 03 08:12:00 crc kubenswrapper[4664]: I1003 08:12:00.475530 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1348c670-9f44-4f00-8044-0afb98bf7368" containerName="nova-api-api" containerID="cri-o://32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd" gracePeriod=30 Oct 03 08:12:00 crc kubenswrapper[4664]: I1003 08:12:00.506829 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 08:12:00 crc kubenswrapper[4664]: I1003 
Oct 03 08:12:00 crc kubenswrapper[4664]: I1003 08:12:00.520878 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.064649 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.102927 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-public-tls-certs\") pod \"1348c670-9f44-4f00-8044-0afb98bf7368\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") "
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.103083 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t6lq\" (UniqueName: \"kubernetes.io/projected/1348c670-9f44-4f00-8044-0afb98bf7368-kube-api-access-7t6lq\") pod \"1348c670-9f44-4f00-8044-0afb98bf7368\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") "
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.103132 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-internal-tls-certs\") pod \"1348c670-9f44-4f00-8044-0afb98bf7368\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") "
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.103208 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-combined-ca-bundle\") pod \"1348c670-9f44-4f00-8044-0afb98bf7368\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") "
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.103264 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-config-data\") pod \"1348c670-9f44-4f00-8044-0afb98bf7368\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") "
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.103304 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1348c670-9f44-4f00-8044-0afb98bf7368-logs\") pod \"1348c670-9f44-4f00-8044-0afb98bf7368\" (UID: \"1348c670-9f44-4f00-8044-0afb98bf7368\") "
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.104191 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1348c670-9f44-4f00-8044-0afb98bf7368-logs" (OuterVolumeSpecName: "logs") pod "1348c670-9f44-4f00-8044-0afb98bf7368" (UID: "1348c670-9f44-4f00-8044-0afb98bf7368"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:12:01 crc kubenswrapper[4664]: E1003 08:12:01.106459 4664 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="191612892f5f8a7fadf384544affdfc81939e55e4ae4d741a1451acc4f2e8551" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 03 08:12:01 crc kubenswrapper[4664]: E1003 08:12:01.112131 4664 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="191612892f5f8a7fadf384544affdfc81939e55e4ae4d741a1451acc4f2e8551" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.114701 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1348c670-9f44-4f00-8044-0afb98bf7368-kube-api-access-7t6lq" (OuterVolumeSpecName: "kube-api-access-7t6lq") pod "1348c670-9f44-4f00-8044-0afb98bf7368" (UID: "1348c670-9f44-4f00-8044-0afb98bf7368"). InnerVolumeSpecName "kube-api-access-7t6lq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:12:01 crc kubenswrapper[4664]: E1003 08:12:01.116754 4664 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="191612892f5f8a7fadf384544affdfc81939e55e4ae4d741a1451acc4f2e8551" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 03 08:12:01 crc kubenswrapper[4664]: E1003 08:12:01.116820 4664 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b62e85e1-8010-42b2-b674-271b49596620" containerName="nova-scheduler-scheduler"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.143757 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-config-data" (OuterVolumeSpecName: "config-data") pod "1348c670-9f44-4f00-8044-0afb98bf7368" (UID: "1348c670-9f44-4f00-8044-0afb98bf7368"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.153220 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1348c670-9f44-4f00-8044-0afb98bf7368" (UID: "1348c670-9f44-4f00-8044-0afb98bf7368"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.201855 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1348c670-9f44-4f00-8044-0afb98bf7368" (UID: "1348c670-9f44-4f00-8044-0afb98bf7368"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.205145 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t6lq\" (UniqueName: \"kubernetes.io/projected/1348c670-9f44-4f00-8044-0afb98bf7368-kube-api-access-7t6lq\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.205181 4664 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.205193 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.205204 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.205216 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1348c670-9f44-4f00-8044-0afb98bf7368-logs\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.217749 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1348c670-9f44-4f00-8044-0afb98bf7368" (UID: "1348c670-9f44-4f00-8044-0afb98bf7368"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.293778 4664 generic.go:334] "Generic (PLEG): container finished" podID="1348c670-9f44-4f00-8044-0afb98bf7368" containerID="32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd" exitCode=0
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.293829 4664 generic.go:334] "Generic (PLEG): container finished" podID="1348c670-9f44-4f00-8044-0afb98bf7368" containerID="0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de" exitCode=143
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.293833 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1348c670-9f44-4f00-8044-0afb98bf7368","Type":"ContainerDied","Data":"32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd"}
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.293907 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1348c670-9f44-4f00-8044-0afb98bf7368","Type":"ContainerDied","Data":"0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de"}
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.293931 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1348c670-9f44-4f00-8044-0afb98bf7368","Type":"ContainerDied","Data":"46613ef0706b1537fdae60939f79b7dbb51b7255ecca6db15559908d48f7c3fa"}
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.293956 4664 scope.go:117] "RemoveContainer" containerID="32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.293859 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Need to start a new one" pod="openstack/nova-api-0" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.308542 4664 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1348c670-9f44-4f00-8044-0afb98bf7368-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.323975 4664 scope.go:117] "RemoveContainer" containerID="0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.329142 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.338040 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.355706 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.356726 4664 scope.go:117] "RemoveContainer" containerID="32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd" Oct 03 08:12:01 crc kubenswrapper[4664]: E1003 08:12:01.357186 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149fac5c-dc4f-47b5-94f7-c36271ad6bc6" containerName="nova-manage" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357205 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="149fac5c-dc4f-47b5-94f7-c36271ad6bc6" containerName="nova-manage" Oct 03 08:12:01 crc kubenswrapper[4664]: E1003 08:12:01.357228 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba25e34e-5665-4cf0-aaf4-b23e705acd52" containerName="extract-utilities" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357238 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba25e34e-5665-4cf0-aaf4-b23e705acd52" containerName="extract-utilities" Oct 03 08:12:01 crc kubenswrapper[4664]: E1003 08:12:01.357255 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea47450-c8b3-4b8f-85f0-53a44121a988" containerName="dnsmasq-dns" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357262 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea47450-c8b3-4b8f-85f0-53a44121a988" containerName="dnsmasq-dns" Oct 03 08:12:01 crc kubenswrapper[4664]: E1003 08:12:01.357276 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1348c670-9f44-4f00-8044-0afb98bf7368" containerName="nova-api-api" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357333 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="1348c670-9f44-4f00-8044-0afb98bf7368" containerName="nova-api-api" Oct 03 08:12:01 crc kubenswrapper[4664]: E1003 08:12:01.357342 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba25e34e-5665-4cf0-aaf4-b23e705acd52" containerName="extract-content" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357348 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba25e34e-5665-4cf0-aaf4-b23e705acd52" containerName="extract-content" Oct 03 08:12:01 crc kubenswrapper[4664]: E1003 08:12:01.357357 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba25e34e-5665-4cf0-aaf4-b23e705acd52" containerName="registry-server" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357364 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba25e34e-5665-4cf0-aaf4-b23e705acd52" containerName="registry-server" Oct 03 08:12:01 crc kubenswrapper[4664]: E1003 08:12:01.357363 4664 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd\": container with ID starting with 32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd not found: ID does not exist" containerID="32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd" Oct 03 08:12:01 crc kubenswrapper[4664]: E1003 08:12:01.357389 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea47450-c8b3-4b8f-85f0-53a44121a988" containerName="init" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357397 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea47450-c8b3-4b8f-85f0-53a44121a988" containerName="init" Oct 03 08:12:01 crc kubenswrapper[4664]: E1003 08:12:01.357419 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1348c670-9f44-4f00-8044-0afb98bf7368" containerName="nova-api-log" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357426 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="1348c670-9f44-4f00-8044-0afb98bf7368" containerName="nova-api-log" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357400 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd"} err="failed to get container status \"32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd\": rpc error: code = NotFound desc = could not find container \"32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd\": container with ID starting with 32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd not found: ID does not exist" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357451 4664 scope.go:117] "RemoveContainer" containerID="0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357746 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="149fac5c-dc4f-47b5-94f7-c36271ad6bc6" containerName="nova-manage" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357775 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea47450-c8b3-4b8f-85f0-53a44121a988" containerName="dnsmasq-dns" Oct 03 08:12:01 crc kubenswrapper[4664]: E1003 08:12:01.357768 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de\": container with ID starting with 0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de not found: ID does not exist" containerID="0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357788 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="1348c670-9f44-4f00-8044-0afb98bf7368" containerName="nova-api-api" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357797 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba25e34e-5665-4cf0-aaf4-b23e705acd52" containerName="registry-server" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357794 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de"} err="failed to get container status \"0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de\": rpc error: code 
= NotFound desc = could not find container \"0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de\": container with ID starting with 0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de not found: ID does not exist" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357813 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="1348c670-9f44-4f00-8044-0afb98bf7368" containerName="nova-api-log" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.357812 4664 scope.go:117] "RemoveContainer" containerID="32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.358382 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd"} err="failed to get container status \"32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd\": rpc error: code = NotFound desc = could not find container \"32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd\": container with ID starting with 32f457212fd2dceee0257de8c9acb57e806bd3f4c44bdf4012bfa570c064cbbd not found: ID does not exist" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.358427 4664 scope.go:117] "RemoveContainer" containerID="0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.359045 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.361772 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de"} err="failed to get container status \"0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de\": rpc error: code = NotFound desc = could not find container \"0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de\": container with ID starting with 0a6da16e6aa2056d3954a2a40ab83727491a3d0d8d2c290880e31d497ad288de not found: ID does not exist" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.362499 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.362503 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.366108 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.372823 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.409786 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-config-data\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.409880 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0" Oct 03 08:12:01 crc 
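The interleaved cpu_manager/state_mem and memory_manager "RemoveStaleState" lines above fire when the replacement nova-api-0 (new UID b708c60d-...) is admitted: before admitting it, the resource managers sweep out per-container CPU and memory assignments still recorded for pods that no longer exist (the old nova-api-0, the dnsmasq pod, the marketplace catalog pod, the cell-mapping job). A toy Go sketch of that sweep, with hypothetical types that only stand in for the managers' richer state:

    package main

    import "fmt"

    type key struct{ podUID, container string }

    // removeStaleState drops assignments whose pod UID is no longer active,
    // mirroring the "RemoveStaleState: removing container" lines above.
    func removeStaleState(assignments map[key]string, active map[string]bool) {
        for k := range assignments {
            if !active[k.podUID] {
                fmt.Printf("RemoveStaleState: removing container %q of pod %s\n", k.container, k.podUID)
                delete(assignments, k)
            }
        }
    }

    func main() {
        assignments := map[key]string{
            {"1348c670-9f44-4f00-8044-0afb98bf7368", "nova-api-api"}: "cpuset 0-3",
        }
        removeStaleState(assignments, map[string]bool{"b708c60d-d069-4a68-8bf7-0d2e9a325eb0": true})
    }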
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.409907 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-public-tls-certs\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.409955 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-logs\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.409995 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf2cj\" (UniqueName: \"kubernetes.io/projected/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-kube-api-access-nf2cj\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.410069 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.512386 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.512472 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-config-data\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.512527 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.512562 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-public-tls-certs\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.512628 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-logs\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.512669 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf2cj\" (UniqueName: \"kubernetes.io/projected/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-kube-api-access-nf2cj\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.513566 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-logs\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.517268 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-public-tls-certs\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.517711 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.518582 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-config-data\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.519136 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.534235 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf2cj\" (UniqueName: \"kubernetes.io/projected/b708c60d-d069-4a68-8bf7-0d2e9a325eb0-kube-api-access-nf2cj\") pod \"nova-api-0\" (UID: \"b708c60d-d069-4a68-8bf7-0d2e9a325eb0\") " pod="openstack/nova-api-0"
Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.681222 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Need to start a new one" pod="openstack/nova-api-0" Oct 03 08:12:01 crc kubenswrapper[4664]: I1003 08:12:01.887640 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1348c670-9f44-4f00-8044-0afb98bf7368" path="/var/lib/kubelet/pods/1348c670-9f44-4f00-8044-0afb98bf7368/volumes" Oct 03 08:12:02 crc kubenswrapper[4664]: I1003 08:12:02.122428 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 08:12:02 crc kubenswrapper[4664]: I1003 08:12:02.309243 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b708c60d-d069-4a68-8bf7-0d2e9a325eb0","Type":"ContainerStarted","Data":"a0ccd64037bcf05de5d839cf70a6d5ba5a977346316b2d1a6bd9dbfd9749147a"} Oct 03 08:12:02 crc kubenswrapper[4664]: I1003 08:12:02.309382 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0c157f08-279b-48dd-85a3-14dd87b27864" containerName="nova-metadata-log" containerID="cri-o://958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0" gracePeriod=30 Oct 03 08:12:02 crc kubenswrapper[4664]: I1003 08:12:02.309420 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0c157f08-279b-48dd-85a3-14dd87b27864" containerName="nova-metadata-metadata" containerID="cri-o://5b2a58751be16c596b359d4ce29d6e09592cec35efa1747f430670c24874ea23" gracePeriod=30 Oct 03 08:12:03 crc kubenswrapper[4664]: I1003 08:12:03.329895 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b708c60d-d069-4a68-8bf7-0d2e9a325eb0","Type":"ContainerStarted","Data":"05075f0e745b0b09d72ed18e4e0ccb470572a6af98171d74552c50462198fa0a"} Oct 03 08:12:03 crc kubenswrapper[4664]: I1003 08:12:03.331382 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b708c60d-d069-4a68-8bf7-0d2e9a325eb0","Type":"ContainerStarted","Data":"a20d94e53f4f9cd0f471957ca1bbe2be1f7e669fd6d7044937cd129865042019"} Oct 03 08:12:03 crc kubenswrapper[4664]: I1003 08:12:03.334288 4664 generic.go:334] "Generic (PLEG): container finished" podID="0c157f08-279b-48dd-85a3-14dd87b27864" containerID="958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0" exitCode=143 Oct 03 08:12:03 crc kubenswrapper[4664]: I1003 08:12:03.334362 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c157f08-279b-48dd-85a3-14dd87b27864","Type":"ContainerDied","Data":"958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0"} Oct 03 08:12:03 crc kubenswrapper[4664]: I1003 08:12:03.362455 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.362428539 podStartE2EDuration="2.362428539s" podCreationTimestamp="2025-10-03 08:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:12:03.35354324 +0000 UTC m=+1424.174733730" watchObservedRunningTime="2025-10-03 08:12:03.362428539 +0000 UTC m=+1424.183619029" Oct 03 08:12:05 crc kubenswrapper[4664]: I1003 08:12:05.458448 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="0c157f08-279b-48dd-85a3-14dd87b27864" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:56560->10.217.0.196:8775: read: connection reset by peer" Oct 03 08:12:05 crc 
Oct 03 08:12:05 crc kubenswrapper[4664]: I1003 08:12:05.909732 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.004721 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.010599 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62e85e1-8010-42b2-b674-271b49596620-combined-ca-bundle\") pod \"b62e85e1-8010-42b2-b674-271b49596620\" (UID: \"b62e85e1-8010-42b2-b674-271b49596620\") "
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.010791 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-nova-metadata-tls-certs\") pod \"0c157f08-279b-48dd-85a3-14dd87b27864\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") "
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.010824 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-config-data\") pod \"0c157f08-279b-48dd-85a3-14dd87b27864\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") "
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.010879 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62e85e1-8010-42b2-b674-271b49596620-config-data\") pod \"b62e85e1-8010-42b2-b674-271b49596620\" (UID: \"b62e85e1-8010-42b2-b674-271b49596620\") "
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.010970 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtn9t\" (UniqueName: \"kubernetes.io/projected/0c157f08-279b-48dd-85a3-14dd87b27864-kube-api-access-wtn9t\") pod \"0c157f08-279b-48dd-85a3-14dd87b27864\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") "
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.011029 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x87lc\" (UniqueName: \"kubernetes.io/projected/b62e85e1-8010-42b2-b674-271b49596620-kube-api-access-x87lc\") pod \"b62e85e1-8010-42b2-b674-271b49596620\" (UID: \"b62e85e1-8010-42b2-b674-271b49596620\") "
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.011052 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-combined-ca-bundle\") pod \"0c157f08-279b-48dd-85a3-14dd87b27864\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") "
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.011161 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c157f08-279b-48dd-85a3-14dd87b27864-logs\") pod \"0c157f08-279b-48dd-85a3-14dd87b27864\" (UID: \"0c157f08-279b-48dd-85a3-14dd87b27864\") "
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.012160 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c157f08-279b-48dd-85a3-14dd87b27864-logs" (OuterVolumeSpecName: "logs") pod "0c157f08-279b-48dd-85a3-14dd87b27864" (UID: "0c157f08-279b-48dd-85a3-14dd87b27864"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.017552 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b62e85e1-8010-42b2-b674-271b49596620-kube-api-access-x87lc" (OuterVolumeSpecName: "kube-api-access-x87lc") pod "b62e85e1-8010-42b2-b674-271b49596620" (UID: "b62e85e1-8010-42b2-b674-271b49596620"). InnerVolumeSpecName "kube-api-access-x87lc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.022783 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c157f08-279b-48dd-85a3-14dd87b27864-kube-api-access-wtn9t" (OuterVolumeSpecName: "kube-api-access-wtn9t") pod "0c157f08-279b-48dd-85a3-14dd87b27864" (UID: "0c157f08-279b-48dd-85a3-14dd87b27864"). InnerVolumeSpecName "kube-api-access-wtn9t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.054699 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-config-data" (OuterVolumeSpecName: "config-data") pod "0c157f08-279b-48dd-85a3-14dd87b27864" (UID: "0c157f08-279b-48dd-85a3-14dd87b27864"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.055075 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c157f08-279b-48dd-85a3-14dd87b27864" (UID: "0c157f08-279b-48dd-85a3-14dd87b27864"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.057147 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b62e85e1-8010-42b2-b674-271b49596620-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b62e85e1-8010-42b2-b674-271b49596620" (UID: "b62e85e1-8010-42b2-b674-271b49596620"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.078953 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b62e85e1-8010-42b2-b674-271b49596620-config-data" (OuterVolumeSpecName: "config-data") pod "b62e85e1-8010-42b2-b674-271b49596620" (UID: "b62e85e1-8010-42b2-b674-271b49596620"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.092073 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0c157f08-279b-48dd-85a3-14dd87b27864" (UID: "0c157f08-279b-48dd-85a3-14dd87b27864"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.112852 4664 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c157f08-279b-48dd-85a3-14dd87b27864-logs\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.112897 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62e85e1-8010-42b2-b674-271b49596620-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.112910 4664 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.112921 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.112932 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62e85e1-8010-42b2-b674-271b49596620-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.112943 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtn9t\" (UniqueName: \"kubernetes.io/projected/0c157f08-279b-48dd-85a3-14dd87b27864-kube-api-access-wtn9t\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.112955 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x87lc\" (UniqueName: \"kubernetes.io/projected/b62e85e1-8010-42b2-b674-271b49596620-kube-api-access-x87lc\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.112967 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c157f08-279b-48dd-85a3-14dd87b27864-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.363582 4664 generic.go:334] "Generic (PLEG): container finished" podID="0c157f08-279b-48dd-85a3-14dd87b27864" containerID="5b2a58751be16c596b359d4ce29d6e09592cec35efa1747f430670c24874ea23" exitCode=0
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.363688 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c157f08-279b-48dd-85a3-14dd87b27864","Type":"ContainerDied","Data":"5b2a58751be16c596b359d4ce29d6e09592cec35efa1747f430670c24874ea23"}
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.363722 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c157f08-279b-48dd-85a3-14dd87b27864","Type":"ContainerDied","Data":"e012e106913ea77f1084dd61c14fad3c53e449083574b74f9909fd8fe99572e9"}
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.363741 4664 scope.go:117] "RemoveContainer" containerID="5b2a58751be16c596b359d4ce29d6e09592cec35efa1747f430670c24874ea23"
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.363769 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.367999 4664 generic.go:334] "Generic (PLEG): container finished" podID="b62e85e1-8010-42b2-b674-271b49596620" containerID="191612892f5f8a7fadf384544affdfc81939e55e4ae4d741a1451acc4f2e8551" exitCode=0
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.368035 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b62e85e1-8010-42b2-b674-271b49596620","Type":"ContainerDied","Data":"191612892f5f8a7fadf384544affdfc81939e55e4ae4d741a1451acc4f2e8551"}
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.368057 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b62e85e1-8010-42b2-b674-271b49596620","Type":"ContainerDied","Data":"d80acc6cb40df6df76226fed4de0cf3921f094654f765edcf8bad1726760d7a8"}
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.368116 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.410577 4664 scope.go:117] "RemoveContainer" containerID="958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0"
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.416119 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.445014 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.454327 4664 scope.go:117] "RemoveContainer" containerID="5b2a58751be16c596b359d4ce29d6e09592cec35efa1747f430670c24874ea23"
Oct 03 08:12:06 crc kubenswrapper[4664]: E1003 08:12:06.465121 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b2a58751be16c596b359d4ce29d6e09592cec35efa1747f430670c24874ea23\": container with ID starting with 5b2a58751be16c596b359d4ce29d6e09592cec35efa1747f430670c24874ea23 not found: ID does not exist" containerID="5b2a58751be16c596b359d4ce29d6e09592cec35efa1747f430670c24874ea23"
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.465186 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b2a58751be16c596b359d4ce29d6e09592cec35efa1747f430670c24874ea23"} err="failed to get container status \"5b2a58751be16c596b359d4ce29d6e09592cec35efa1747f430670c24874ea23\": rpc error: code = NotFound desc = could not find container \"5b2a58751be16c596b359d4ce29d6e09592cec35efa1747f430670c24874ea23\": container with ID starting with 5b2a58751be16c596b359d4ce29d6e09592cec35efa1747f430670c24874ea23 not found: ID does not exist"
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.465212 4664 scope.go:117] "RemoveContainer" containerID="958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0"
Oct 03 08:12:06 crc kubenswrapper[4664]: E1003 08:12:06.470343 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0\": container with ID starting with 958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0 not found: ID does not exist" containerID="958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0"
Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.470418 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0"} err="failed to get container status \"958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0\": rpc error: code = NotFound desc = could not find container \"958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0\": container with ID starting with 958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0 not found: ID does not exist"
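The RemoveContainer → NotFound → "DeleteContainer returned error" triplets above are a benign race: CRI-O has already removed the container by the time the kubelet asks for its status again, so the runtime answers with a gRPC NotFound and the kubelet just logs it and moves on. A dependency-free sketch of that tolerate-NotFound cleanup pattern (all names invented, not the real CRI client):

```go
package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for the "rpc error: code = NotFound" seen above.
var errNotFound = errors.New("NotFound: ID does not exist")

// removeContainer simulates a CRI RemoveContainer call against a
// container the runtime has already deleted.
func removeContainer(id string) error {
	return fmt.Errorf("could not find container %q: %w", id, errNotFound)
}

func cleanup(id string) {
	err := removeContainer(id)
	switch {
	case err == nil:
		fmt.Println("removed", id)
	case errors.Is(err, errNotFound):
		// Already gone: log and continue, which is what the kubelet does.
		fmt.Printf("container %s already removed: %v\n", id, err)
	default:
		fmt.Println("unexpected error:", err) // anything else would be a real failure
	}
}

func main() { cleanup("5b2a58751be1") }
```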
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0"} err="failed to get container status \"958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0\": rpc error: code = NotFound desc = could not find container \"958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0\": container with ID starting with 958aa7ca0a5322005f9012262823ce36614c9dc908ce654220fe4e337d2168b0 not found: ID does not exist" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.470457 4664 scope.go:117] "RemoveContainer" containerID="191612892f5f8a7fadf384544affdfc81939e55e4ae4d741a1451acc4f2e8551" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.470649 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.490534 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 08:12:06 crc kubenswrapper[4664]: E1003 08:12:06.491108 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c157f08-279b-48dd-85a3-14dd87b27864" containerName="nova-metadata-log" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.491130 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c157f08-279b-48dd-85a3-14dd87b27864" containerName="nova-metadata-log" Oct 03 08:12:06 crc kubenswrapper[4664]: E1003 08:12:06.491142 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b62e85e1-8010-42b2-b674-271b49596620" containerName="nova-scheduler-scheduler" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.491150 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b62e85e1-8010-42b2-b674-271b49596620" containerName="nova-scheduler-scheduler" Oct 03 08:12:06 crc kubenswrapper[4664]: E1003 08:12:06.491166 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c157f08-279b-48dd-85a3-14dd87b27864" containerName="nova-metadata-metadata" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.491174 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c157f08-279b-48dd-85a3-14dd87b27864" containerName="nova-metadata-metadata" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.491425 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c157f08-279b-48dd-85a3-14dd87b27864" containerName="nova-metadata-metadata" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.491445 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c157f08-279b-48dd-85a3-14dd87b27864" containerName="nova-metadata-log" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.491457 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="b62e85e1-8010-42b2-b674-271b49596620" containerName="nova-scheduler-scheduler" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.492976 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.497734 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.499766 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.501965 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.512858 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.514288 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.527034 4664 scope.go:117] "RemoveContainer" containerID="191612892f5f8a7fadf384544affdfc81939e55e4ae4d741a1451acc4f2e8551" Oct 03 08:12:06 crc kubenswrapper[4664]: E1003 08:12:06.527488 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"191612892f5f8a7fadf384544affdfc81939e55e4ae4d741a1451acc4f2e8551\": container with ID starting with 191612892f5f8a7fadf384544affdfc81939e55e4ae4d741a1451acc4f2e8551 not found: ID does not exist" containerID="191612892f5f8a7fadf384544affdfc81939e55e4ae4d741a1451acc4f2e8551" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.527531 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191612892f5f8a7fadf384544affdfc81939e55e4ae4d741a1451acc4f2e8551"} err="failed to get container status \"191612892f5f8a7fadf384544affdfc81939e55e4ae4d741a1451acc4f2e8551\": rpc error: code = NotFound desc = could not find container \"191612892f5f8a7fadf384544affdfc81939e55e4ae4d741a1451acc4f2e8551\": container with ID starting with 191612892f5f8a7fadf384544affdfc81939e55e4ae4d741a1451acc4f2e8551 not found: ID does not exist" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.533187 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.546451 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.556487 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.628136 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlqnl\" (UniqueName: \"kubernetes.io/projected/817b8b02-eef7-4753-ad19-8bf7fd3fbe9a-kube-api-access-jlqnl\") pod \"nova-metadata-0\" (UID: \"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a\") " pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.628207 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/817b8b02-eef7-4753-ad19-8bf7fd3fbe9a-logs\") pod \"nova-metadata-0\" (UID: \"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a\") " pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.628360 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7gqg\" (UniqueName: 
\"kubernetes.io/projected/e271fae6-d173-43f6-ad2b-27e3c182134b-kube-api-access-m7gqg\") pod \"nova-scheduler-0\" (UID: \"e271fae6-d173-43f6-ad2b-27e3c182134b\") " pod="openstack/nova-scheduler-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.628437 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e271fae6-d173-43f6-ad2b-27e3c182134b-config-data\") pod \"nova-scheduler-0\" (UID: \"e271fae6-d173-43f6-ad2b-27e3c182134b\") " pod="openstack/nova-scheduler-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.628653 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/817b8b02-eef7-4753-ad19-8bf7fd3fbe9a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a\") " pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.628710 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/817b8b02-eef7-4753-ad19-8bf7fd3fbe9a-config-data\") pod \"nova-metadata-0\" (UID: \"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a\") " pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.628894 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/817b8b02-eef7-4753-ad19-8bf7fd3fbe9a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a\") " pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.629126 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e271fae6-d173-43f6-ad2b-27e3c182134b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e271fae6-d173-43f6-ad2b-27e3c182134b\") " pod="openstack/nova-scheduler-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.730945 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/817b8b02-eef7-4753-ad19-8bf7fd3fbe9a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a\") " pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.730998 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/817b8b02-eef7-4753-ad19-8bf7fd3fbe9a-config-data\") pod \"nova-metadata-0\" (UID: \"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a\") " pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.731058 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/817b8b02-eef7-4753-ad19-8bf7fd3fbe9a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a\") " pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.731089 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e271fae6-d173-43f6-ad2b-27e3c182134b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e271fae6-d173-43f6-ad2b-27e3c182134b\") " 
pod="openstack/nova-scheduler-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.731128 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlqnl\" (UniqueName: \"kubernetes.io/projected/817b8b02-eef7-4753-ad19-8bf7fd3fbe9a-kube-api-access-jlqnl\") pod \"nova-metadata-0\" (UID: \"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a\") " pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.731156 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/817b8b02-eef7-4753-ad19-8bf7fd3fbe9a-logs\") pod \"nova-metadata-0\" (UID: \"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a\") " pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.731178 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7gqg\" (UniqueName: \"kubernetes.io/projected/e271fae6-d173-43f6-ad2b-27e3c182134b-kube-api-access-m7gqg\") pod \"nova-scheduler-0\" (UID: \"e271fae6-d173-43f6-ad2b-27e3c182134b\") " pod="openstack/nova-scheduler-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.731201 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e271fae6-d173-43f6-ad2b-27e3c182134b-config-data\") pod \"nova-scheduler-0\" (UID: \"e271fae6-d173-43f6-ad2b-27e3c182134b\") " pod="openstack/nova-scheduler-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.733648 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/817b8b02-eef7-4753-ad19-8bf7fd3fbe9a-logs\") pod \"nova-metadata-0\" (UID: \"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a\") " pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.736514 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e271fae6-d173-43f6-ad2b-27e3c182134b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e271fae6-d173-43f6-ad2b-27e3c182134b\") " pod="openstack/nova-scheduler-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.737280 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e271fae6-d173-43f6-ad2b-27e3c182134b-config-data\") pod \"nova-scheduler-0\" (UID: \"e271fae6-d173-43f6-ad2b-27e3c182134b\") " pod="openstack/nova-scheduler-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.738847 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/817b8b02-eef7-4753-ad19-8bf7fd3fbe9a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a\") " pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.739341 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/817b8b02-eef7-4753-ad19-8bf7fd3fbe9a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a\") " pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.739534 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/817b8b02-eef7-4753-ad19-8bf7fd3fbe9a-config-data\") pod \"nova-metadata-0\" (UID: 
\"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a\") " pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.750180 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7gqg\" (UniqueName: \"kubernetes.io/projected/e271fae6-d173-43f6-ad2b-27e3c182134b-kube-api-access-m7gqg\") pod \"nova-scheduler-0\" (UID: \"e271fae6-d173-43f6-ad2b-27e3c182134b\") " pod="openstack/nova-scheduler-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.750308 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlqnl\" (UniqueName: \"kubernetes.io/projected/817b8b02-eef7-4753-ad19-8bf7fd3fbe9a-kube-api-access-jlqnl\") pod \"nova-metadata-0\" (UID: \"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a\") " pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.819057 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 08:12:06 crc kubenswrapper[4664]: I1003 08:12:06.896437 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.293017 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xgmwb"] Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.295355 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xgmwb" Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.364759 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xgmwb"] Oct 03 08:12:07 crc kubenswrapper[4664]: W1003 08:12:07.372058 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod817b8b02_eef7_4753_ad19_8bf7fd3fbe9a.slice/crio-080b136b55fd0d2399d9bc7785a206cd7f20c06ad1148495d233e36f567affc0 WatchSource:0}: Error finding container 080b136b55fd0d2399d9bc7785a206cd7f20c06ad1148495d233e36f567affc0: Status 404 returned error can't find the container with id 080b136b55fd0d2399d9bc7785a206cd7f20c06ad1148495d233e36f567affc0 Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.404592 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.449066 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh869\" (UniqueName: \"kubernetes.io/projected/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-kube-api-access-qh869\") pod \"certified-operators-xgmwb\" (UID: \"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4\") " pod="openshift-marketplace/certified-operators-xgmwb" Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.449189 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-utilities\") pod \"certified-operators-xgmwb\" (UID: \"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4\") " pod="openshift-marketplace/certified-operators-xgmwb" Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.449207 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-catalog-content\") pod \"certified-operators-xgmwb\" (UID: 
\"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4\") " pod="openshift-marketplace/certified-operators-xgmwb" Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.508724 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 08:12:07 crc kubenswrapper[4664]: W1003 08:12:07.520667 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode271fae6_d173_43f6_ad2b_27e3c182134b.slice/crio-6376d2529d029e32c7f29dde0017b31d2a123c478b50a7a8ca7e28cea458eadc WatchSource:0}: Error finding container 6376d2529d029e32c7f29dde0017b31d2a123c478b50a7a8ca7e28cea458eadc: Status 404 returned error can't find the container with id 6376d2529d029e32c7f29dde0017b31d2a123c478b50a7a8ca7e28cea458eadc Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.550642 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh869\" (UniqueName: \"kubernetes.io/projected/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-kube-api-access-qh869\") pod \"certified-operators-xgmwb\" (UID: \"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4\") " pod="openshift-marketplace/certified-operators-xgmwb" Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.551081 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-utilities\") pod \"certified-operators-xgmwb\" (UID: \"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4\") " pod="openshift-marketplace/certified-operators-xgmwb" Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.551119 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-catalog-content\") pod \"certified-operators-xgmwb\" (UID: \"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4\") " pod="openshift-marketplace/certified-operators-xgmwb" Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.551678 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-utilities\") pod \"certified-operators-xgmwb\" (UID: \"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4\") " pod="openshift-marketplace/certified-operators-xgmwb" Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.551822 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-catalog-content\") pod \"certified-operators-xgmwb\" (UID: \"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4\") " pod="openshift-marketplace/certified-operators-xgmwb" Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.572869 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh869\" (UniqueName: \"kubernetes.io/projected/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-kube-api-access-qh869\") pod \"certified-operators-xgmwb\" (UID: \"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4\") " pod="openshift-marketplace/certified-operators-xgmwb" Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.690638 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xgmwb" Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.920589 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c157f08-279b-48dd-85a3-14dd87b27864" path="/var/lib/kubelet/pods/0c157f08-279b-48dd-85a3-14dd87b27864/volumes" Oct 03 08:12:07 crc kubenswrapper[4664]: I1003 08:12:07.921665 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b62e85e1-8010-42b2-b674-271b49596620" path="/var/lib/kubelet/pods/b62e85e1-8010-42b2-b674-271b49596620/volumes" Oct 03 08:12:08 crc kubenswrapper[4664]: I1003 08:12:08.380929 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xgmwb"] Oct 03 08:12:08 crc kubenswrapper[4664]: W1003 08:12:08.383196 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3de7938_81c4_44e0_a3e5_d050b3c9dcc4.slice/crio-c921d6393f031ecc4ec1b5835c56be9724a765bfb4258b237cfed30d865402af WatchSource:0}: Error finding container c921d6393f031ecc4ec1b5835c56be9724a765bfb4258b237cfed30d865402af: Status 404 returned error can't find the container with id c921d6393f031ecc4ec1b5835c56be9724a765bfb4258b237cfed30d865402af Oct 03 08:12:08 crc kubenswrapper[4664]: I1003 08:12:08.455472 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e271fae6-d173-43f6-ad2b-27e3c182134b","Type":"ContainerStarted","Data":"9f6491bf0cc0a6d9d6b834565456b9ba82bb29d82ebcab8e209e2798b9002996"} Oct 03 08:12:08 crc kubenswrapper[4664]: I1003 08:12:08.456348 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e271fae6-d173-43f6-ad2b-27e3c182134b","Type":"ContainerStarted","Data":"6376d2529d029e32c7f29dde0017b31d2a123c478b50a7a8ca7e28cea458eadc"} Oct 03 08:12:08 crc kubenswrapper[4664]: I1003 08:12:08.457029 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgmwb" event={"ID":"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4","Type":"ContainerStarted","Data":"c921d6393f031ecc4ec1b5835c56be9724a765bfb4258b237cfed30d865402af"} Oct 03 08:12:08 crc kubenswrapper[4664]: I1003 08:12:08.459157 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a","Type":"ContainerStarted","Data":"27ee16ce089f94155738e0399c4d280447829dd80cdafaccd4cde5bd996aafe3"} Oct 03 08:12:08 crc kubenswrapper[4664]: I1003 08:12:08.459215 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a","Type":"ContainerStarted","Data":"8df0c71efd7316d904b22d40447cc8aa9df45ce33a1846525ec5a83ee6a95bbe"} Oct 03 08:12:08 crc kubenswrapper[4664]: I1003 08:12:08.459227 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"817b8b02-eef7-4753-ad19-8bf7fd3fbe9a","Type":"ContainerStarted","Data":"080b136b55fd0d2399d9bc7785a206cd7f20c06ad1148495d233e36f567affc0"} Oct 03 08:12:08 crc kubenswrapper[4664]: I1003 08:12:08.481240 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.4812215970000002 podStartE2EDuration="2.481221597s" podCreationTimestamp="2025-10-03 08:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-03 08:12:08.471453341 +0000 UTC m=+1429.292643831" watchObservedRunningTime="2025-10-03 08:12:08.481221597 +0000 UTC m=+1429.302412087" Oct 03 08:12:08 crc kubenswrapper[4664]: I1003 08:12:08.495863 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.49584205 podStartE2EDuration="2.49584205s" podCreationTimestamp="2025-10-03 08:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:12:08.490603441 +0000 UTC m=+1429.311793951" watchObservedRunningTime="2025-10-03 08:12:08.49584205 +0000 UTC m=+1429.317032530" Oct 03 08:12:09 crc kubenswrapper[4664]: I1003 08:12:09.469171 4664 generic.go:334] "Generic (PLEG): container finished" podID="b3de7938-81c4-44e0-a3e5-d050b3c9dcc4" containerID="545d464a8b92272475155259146737cc34c0d821148042f23933721e62e32e89" exitCode=0 Oct 03 08:12:09 crc kubenswrapper[4664]: I1003 08:12:09.470609 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgmwb" event={"ID":"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4","Type":"ContainerDied","Data":"545d464a8b92272475155259146737cc34c0d821148042f23933721e62e32e89"} Oct 03 08:12:10 crc kubenswrapper[4664]: I1003 08:12:10.481211 4664 generic.go:334] "Generic (PLEG): container finished" podID="b3de7938-81c4-44e0-a3e5-d050b3c9dcc4" containerID="9c6fc0d2ed8ef97fb80cc460d5629f796641adcf87b3c18274bc11d443c0ba26" exitCode=0 Oct 03 08:12:10 crc kubenswrapper[4664]: I1003 08:12:10.481284 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgmwb" event={"ID":"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4","Type":"ContainerDied","Data":"9c6fc0d2ed8ef97fb80cc460d5629f796641adcf87b3c18274bc11d443c0ba26"} Oct 03 08:12:11 crc kubenswrapper[4664]: I1003 08:12:11.495277 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgmwb" event={"ID":"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4","Type":"ContainerStarted","Data":"94daf1e49e39b0f43104a72b9afa83a8015f34f426257cd3cb0b7bdbadbd17ad"} Oct 03 08:12:11 crc kubenswrapper[4664]: I1003 08:12:11.519249 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xgmwb" podStartSLOduration=3.081415993 podStartE2EDuration="4.519229595s" podCreationTimestamp="2025-10-03 08:12:07 +0000 UTC" firstStartedPulling="2025-10-03 08:12:09.472047119 +0000 UTC m=+1430.293237609" lastFinishedPulling="2025-10-03 08:12:10.909860731 +0000 UTC m=+1431.731051211" observedRunningTime="2025-10-03 08:12:11.510640186 +0000 UTC m=+1432.331830686" watchObservedRunningTime="2025-10-03 08:12:11.519229595 +0000 UTC m=+1432.340420085" Oct 03 08:12:11 crc kubenswrapper[4664]: I1003 08:12:11.681751 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 08:12:11 crc kubenswrapper[4664]: I1003 08:12:11.681805 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 08:12:11 crc kubenswrapper[4664]: I1003 08:12:11.819376 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 08:12:11 crc kubenswrapper[4664]: I1003 08:12:11.819424 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 08:12:11 crc kubenswrapper[4664]: I1003 08:12:11.897643 
Oct 03 08:12:11 crc kubenswrapper[4664]: I1003 08:12:11.987019 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 08:12:11 crc kubenswrapper[4664]: I1003 08:12:11.987080 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 08:12:12 crc kubenswrapper[4664]: I1003 08:12:12.696732 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b708c60d-d069-4a68-8bf7-0d2e9a325eb0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 03 08:12:12 crc kubenswrapper[4664]: I1003 08:12:12.696881 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b708c60d-d069-4a68-8bf7-0d2e9a325eb0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 03 08:12:14 crc kubenswrapper[4664]: I1003 08:12:14.657765 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbkv"]
Oct 03 08:12:14 crc kubenswrapper[4664]: I1003 08:12:14.662917 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:14 crc kubenswrapper[4664]: I1003 08:12:14.672300 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbkv"]
Oct 03 08:12:14 crc kubenswrapper[4664]: I1003 08:12:14.707457 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnxj2\" (UniqueName: \"kubernetes.io/projected/3040da34-0901-40ca-9231-ca7a8dccf838-kube-api-access-tnxj2\") pod \"redhat-marketplace-jxbkv\" (UID: \"3040da34-0901-40ca-9231-ca7a8dccf838\") " pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:14 crc kubenswrapper[4664]: I1003 08:12:14.707723 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3040da34-0901-40ca-9231-ca7a8dccf838-utilities\") pod \"redhat-marketplace-jxbkv\" (UID: \"3040da34-0901-40ca-9231-ca7a8dccf838\") " pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:14 crc kubenswrapper[4664]: I1003 08:12:14.707753 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3040da34-0901-40ca-9231-ca7a8dccf838-catalog-content\") pod \"redhat-marketplace-jxbkv\" (UID: \"3040da34-0901-40ca-9231-ca7a8dccf838\") " pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:14 crc kubenswrapper[4664]: I1003 08:12:14.809437 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3040da34-0901-40ca-9231-ca7a8dccf838-utilities\") pod \"redhat-marketplace-jxbkv\" (UID: \"3040da34-0901-40ca-9231-ca7a8dccf838\") " pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:14 crc kubenswrapper[4664]: I1003 08:12:14.810032 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3040da34-0901-40ca-9231-ca7a8dccf838-catalog-content\") pod \"redhat-marketplace-jxbkv\" (UID: \"3040da34-0901-40ca-9231-ca7a8dccf838\") " pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:14 crc kubenswrapper[4664]: I1003 08:12:14.810056 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3040da34-0901-40ca-9231-ca7a8dccf838-utilities\") pod \"redhat-marketplace-jxbkv\" (UID: \"3040da34-0901-40ca-9231-ca7a8dccf838\") " pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:14 crc kubenswrapper[4664]: I1003 08:12:14.810187 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnxj2\" (UniqueName: \"kubernetes.io/projected/3040da34-0901-40ca-9231-ca7a8dccf838-kube-api-access-tnxj2\") pod \"redhat-marketplace-jxbkv\" (UID: \"3040da34-0901-40ca-9231-ca7a8dccf838\") " pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:14 crc kubenswrapper[4664]: I1003 08:12:14.810315 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3040da34-0901-40ca-9231-ca7a8dccf838-catalog-content\") pod \"redhat-marketplace-jxbkv\" (UID: \"3040da34-0901-40ca-9231-ca7a8dccf838\") " pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:14 crc kubenswrapper[4664]: I1003 08:12:14.832931 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnxj2\" (UniqueName: \"kubernetes.io/projected/3040da34-0901-40ca-9231-ca7a8dccf838-kube-api-access-tnxj2\") pod \"redhat-marketplace-jxbkv\" (UID: \"3040da34-0901-40ca-9231-ca7a8dccf838\") " pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:15 crc kubenswrapper[4664]: I1003 08:12:15.022010 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:15 crc kubenswrapper[4664]: I1003 08:12:15.540291 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbkv"]
Oct 03 08:12:16 crc kubenswrapper[4664]: I1003 08:12:16.567249 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbkv" event={"ID":"3040da34-0901-40ca-9231-ca7a8dccf838","Type":"ContainerStarted","Data":"83ec0687a3aaf234bf90791c4e02e42790c2e8b1630b3cbbcd7d3a56c5989707"}
Oct 03 08:12:16 crc kubenswrapper[4664]: I1003 08:12:16.567546 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbkv" event={"ID":"3040da34-0901-40ca-9231-ca7a8dccf838","Type":"ContainerStarted","Data":"451226b0e869c73d7ac4398b7bbe583e1825f6968cd4044f6fa0d340ec69bd34"}
Oct 03 08:12:16 crc kubenswrapper[4664]: I1003 08:12:16.819213 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 03 08:12:16 crc kubenswrapper[4664]: I1003 08:12:16.820653 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 03 08:12:16 crc kubenswrapper[4664]: I1003 08:12:16.899901 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 03 08:12:16 crc kubenswrapper[4664]: I1003 08:12:16.960967 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 03 08:12:17 crc kubenswrapper[4664]: I1003 08:12:17.581785 4664 generic.go:334] "Generic (PLEG): container finished" podID="3040da34-0901-40ca-9231-ca7a8dccf838" containerID="83ec0687a3aaf234bf90791c4e02e42790c2e8b1630b3cbbcd7d3a56c5989707" exitCode=0
Oct 03 08:12:17 crc kubenswrapper[4664]: I1003 08:12:17.581830 4664 generic.go:334] "Generic (PLEG): container finished" podID="3040da34-0901-40ca-9231-ca7a8dccf838" containerID="2dcc55d10aa329f7296a6a45d71b5060d2a58a9e16cc0831a0b900cb1b617032" exitCode=0
Oct 03 08:12:17 crc kubenswrapper[4664]: I1003 08:12:17.581868 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbkv" event={"ID":"3040da34-0901-40ca-9231-ca7a8dccf838","Type":"ContainerDied","Data":"83ec0687a3aaf234bf90791c4e02e42790c2e8b1630b3cbbcd7d3a56c5989707"}
Oct 03 08:12:17 crc kubenswrapper[4664]: I1003 08:12:17.581913 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbkv" event={"ID":"3040da34-0901-40ca-9231-ca7a8dccf838","Type":"ContainerDied","Data":"2dcc55d10aa329f7296a6a45d71b5060d2a58a9e16cc0831a0b900cb1b617032"}
Oct 03 08:12:17 crc kubenswrapper[4664]: I1003 08:12:17.622363 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 03 08:12:17 crc kubenswrapper[4664]: I1003 08:12:17.691803 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xgmwb"
Oct 03 08:12:17 crc kubenswrapper[4664]: I1003 08:12:17.691873 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xgmwb"
Oct 03 08:12:17 crc kubenswrapper[4664]: I1003 08:12:17.747628 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xgmwb"
Oct 03 08:12:17 crc kubenswrapper[4664]: I1003 08:12:17.873994 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="817b8b02-eef7-4753-ad19-8bf7fd3fbe9a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 03 08:12:17 crc kubenswrapper[4664]: I1003 08:12:17.874072 4664 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="817b8b02-eef7-4753-ad19-8bf7fd3fbe9a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 03 08:12:18 crc kubenswrapper[4664]: I1003 08:12:18.601201 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbkv" event={"ID":"3040da34-0901-40ca-9231-ca7a8dccf838","Type":"ContainerStarted","Data":"f6c9d6e42745f370601c152766c5735fe04cf93436e994a70c020c40618f412b"}
Oct 03 08:12:18 crc kubenswrapper[4664]: I1003 08:12:18.638022 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jxbkv" podStartSLOduration=2.852640934 podStartE2EDuration="4.638000586s" podCreationTimestamp="2025-10-03 08:12:14 +0000 UTC" firstStartedPulling="2025-10-03 08:12:16.569134743 +0000 UTC m=+1437.390325233" lastFinishedPulling="2025-10-03 08:12:18.354494405 +0000 UTC m=+1439.175684885" observedRunningTime="2025-10-03 08:12:18.633422057 +0000 UTC m=+1439.454612577" watchObservedRunningTime="2025-10-03 08:12:18.638000586 +0000 UTC m=+1439.459191096"
Oct 03 08:12:18 crc kubenswrapper[4664]: I1003 08:12:18.657815 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xgmwb"
Oct 03 08:12:20 crc kubenswrapper[4664]: I1003 08:12:20.052437 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xgmwb"]
Oct 03 08:12:20 crc kubenswrapper[4664]: I1003 08:12:20.449758 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 03 08:12:20 crc kubenswrapper[4664]: I1003 08:12:20.628247 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xgmwb" podUID="b3de7938-81c4-44e0-a3e5-d050b3c9dcc4" containerName="registry-server" containerID="cri-o://94daf1e49e39b0f43104a72b9afa83a8015f34f426257cd3cb0b7bdbadbd17ad" gracePeriod=2
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.092677 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xgmwb"
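The Startup probe failures above ("Client.Timeout exceeded while awaiting headers", "context deadline exceeded") are what Go's http.Client reports when the endpoint accepts nothing within the probe's timeout, whereas "connection reset"/"connection refused" mean the listener is gone. A minimal sketch of such an HTTPS probe (the URL and 1s timeout are illustrative; InsecureSkipVerify mirrors probing a self-signed endpoint and is not necessarily how the kubelet's prober is configured):

```go
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

// probe returns nil when the endpoint answers in time with a non-error status.
func probe(url string) error {
	client := &http.Client{
		Timeout: 1 * time.Second, // a slow app yields "Client.Timeout exceeded while awaiting headers"
		Transport: &http.Transport{
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get(url)
	if err != nil {
		return err // surfaces as probeResult="failure" in the log
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy status %s", resp.Status)
	}
	return nil
}

func main() {
	if err := probe("https://10.217.0.202:8775/"); err != nil {
		fmt.Println("Probe failed:", err)
	}
}
```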
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.138808 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh869\" (UniqueName: \"kubernetes.io/projected/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-kube-api-access-qh869\") pod \"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4\" (UID: \"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4\") "
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.141832 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-catalog-content\") pod \"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4\" (UID: \"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4\") "
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.141899 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-utilities\") pod \"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4\" (UID: \"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4\") "
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.143846 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-utilities" (OuterVolumeSpecName: "utilities") pod "b3de7938-81c4-44e0-a3e5-d050b3c9dcc4" (UID: "b3de7938-81c4-44e0-a3e5-d050b3c9dcc4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.147682 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-kube-api-access-qh869" (OuterVolumeSpecName: "kube-api-access-qh869") pod "b3de7938-81c4-44e0-a3e5-d050b3c9dcc4" (UID: "b3de7938-81c4-44e0-a3e5-d050b3c9dcc4"). InnerVolumeSpecName "kube-api-access-qh869". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.244897 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3de7938-81c4-44e0-a3e5-d050b3c9dcc4" (UID: "b3de7938-81c4-44e0-a3e5-d050b3c9dcc4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.245709 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.246563 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.246585 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh869\" (UniqueName: \"kubernetes.io/projected/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4-kube-api-access-qh869\") on node \"crc\" DevicePath \"\""
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.644746 4664 generic.go:334] "Generic (PLEG): container finished" podID="b3de7938-81c4-44e0-a3e5-d050b3c9dcc4" containerID="94daf1e49e39b0f43104a72b9afa83a8015f34f426257cd3cb0b7bdbadbd17ad" exitCode=0
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.644804 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgmwb" event={"ID":"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4","Type":"ContainerDied","Data":"94daf1e49e39b0f43104a72b9afa83a8015f34f426257cd3cb0b7bdbadbd17ad"}
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.644846 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xgmwb"
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.644867 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgmwb" event={"ID":"b3de7938-81c4-44e0-a3e5-d050b3c9dcc4","Type":"ContainerDied","Data":"c921d6393f031ecc4ec1b5835c56be9724a765bfb4258b237cfed30d865402af"}
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.644889 4664 scope.go:117] "RemoveContainer" containerID="94daf1e49e39b0f43104a72b9afa83a8015f34f426257cd3cb0b7bdbadbd17ad"
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.674842 4664 scope.go:117] "RemoveContainer" containerID="9c6fc0d2ed8ef97fb80cc460d5629f796641adcf87b3c18274bc11d443c0ba26"
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.695807 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.698122 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.698848 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.702991 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xgmwb"]
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.706810 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.713768 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xgmwb"]
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.717041 4664 scope.go:117] "RemoveContainer" containerID="545d464a8b92272475155259146737cc34c0d821148042f23933721e62e32e89"
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.757092 4664 scope.go:117] "RemoveContainer" containerID="94daf1e49e39b0f43104a72b9afa83a8015f34f426257cd3cb0b7bdbadbd17ad"
Oct 03 08:12:21 crc kubenswrapper[4664]: E1003 08:12:21.757886 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94daf1e49e39b0f43104a72b9afa83a8015f34f426257cd3cb0b7bdbadbd17ad\": container with ID starting with 94daf1e49e39b0f43104a72b9afa83a8015f34f426257cd3cb0b7bdbadbd17ad not found: ID does not exist" containerID="94daf1e49e39b0f43104a72b9afa83a8015f34f426257cd3cb0b7bdbadbd17ad"
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.757940 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94daf1e49e39b0f43104a72b9afa83a8015f34f426257cd3cb0b7bdbadbd17ad"} err="failed to get container status \"94daf1e49e39b0f43104a72b9afa83a8015f34f426257cd3cb0b7bdbadbd17ad\": rpc error: code = NotFound desc = could not find container \"94daf1e49e39b0f43104a72b9afa83a8015f34f426257cd3cb0b7bdbadbd17ad\": container with ID starting with 94daf1e49e39b0f43104a72b9afa83a8015f34f426257cd3cb0b7bdbadbd17ad not found: ID does not exist"
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.757972 4664 scope.go:117] "RemoveContainer" containerID="9c6fc0d2ed8ef97fb80cc460d5629f796641adcf87b3c18274bc11d443c0ba26"
Oct 03 08:12:21 crc kubenswrapper[4664]: E1003 08:12:21.758588 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c6fc0d2ed8ef97fb80cc460d5629f796641adcf87b3c18274bc11d443c0ba26\": container with ID starting with 9c6fc0d2ed8ef97fb80cc460d5629f796641adcf87b3c18274bc11d443c0ba26 not found: ID does not exist" containerID="9c6fc0d2ed8ef97fb80cc460d5629f796641adcf87b3c18274bc11d443c0ba26"
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.758652 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c6fc0d2ed8ef97fb80cc460d5629f796641adcf87b3c18274bc11d443c0ba26"} err="failed to get container status \"9c6fc0d2ed8ef97fb80cc460d5629f796641adcf87b3c18274bc11d443c0ba26\": rpc error: code = NotFound desc = could not find container \"9c6fc0d2ed8ef97fb80cc460d5629f796641adcf87b3c18274bc11d443c0ba26\": container with ID starting with 9c6fc0d2ed8ef97fb80cc460d5629f796641adcf87b3c18274bc11d443c0ba26 not found: ID does not exist"
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.758688 4664 scope.go:117] "RemoveContainer" containerID="545d464a8b92272475155259146737cc34c0d821148042f23933721e62e32e89"
Oct 03 08:12:21 crc kubenswrapper[4664]: E1003 08:12:21.759200 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"545d464a8b92272475155259146737cc34c0d821148042f23933721e62e32e89\": container with ID starting with 545d464a8b92272475155259146737cc34c0d821148042f23933721e62e32e89 not found: ID does not exist" containerID="545d464a8b92272475155259146737cc34c0d821148042f23933721e62e32e89"
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.759227 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"545d464a8b92272475155259146737cc34c0d821148042f23933721e62e32e89"} err="failed to get container status \"545d464a8b92272475155259146737cc34c0d821148042f23933721e62e32e89\": rpc error: code = NotFound desc = could not find container \"545d464a8b92272475155259146737cc34c0d821148042f23933721e62e32e89\": container with ID starting with 545d464a8b92272475155259146737cc34c0d821148042f23933721e62e32e89 not found: ID does not exist"
Oct 03 08:12:21 crc kubenswrapper[4664]: I1003 08:12:21.888696 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3de7938-81c4-44e0-a3e5-d050b3c9dcc4" path="/var/lib/kubelet/pods/b3de7938-81c4-44e0-a3e5-d050b3c9dcc4/volumes"
Oct 03 08:12:22 crc kubenswrapper[4664]: I1003 08:12:22.656354 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 03 08:12:22 crc kubenswrapper[4664]: I1003 08:12:22.663191 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 03 08:12:25 crc kubenswrapper[4664]: I1003 08:12:25.022582 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:25 crc kubenswrapper[4664]: I1003 08:12:25.023089 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:25 crc kubenswrapper[4664]: I1003 08:12:25.087903 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:25 crc kubenswrapper[4664]: I1003 08:12:25.747488 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:25 crc kubenswrapper[4664]: I1003 08:12:25.811644 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbkv"]
Oct 03 08:12:26 crc kubenswrapper[4664]: I1003 08:12:26.827440 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 03 08:12:26 crc kubenswrapper[4664]: I1003 08:12:26.828492 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 03 08:12:26 crc kubenswrapper[4664]: I1003 08:12:26.840069 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 03 08:12:27 crc kubenswrapper[4664]: I1003 08:12:27.710409 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jxbkv" podUID="3040da34-0901-40ca-9231-ca7a8dccf838" containerName="registry-server" containerID="cri-o://f6c9d6e42745f370601c152766c5735fe04cf93436e994a70c020c40618f412b" gracePeriod=2
Oct 03 08:12:27 crc kubenswrapper[4664]: I1003 08:12:27.719485 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.263841 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxbkv"
Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.318420 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnxj2\" (UniqueName: \"kubernetes.io/projected/3040da34-0901-40ca-9231-ca7a8dccf838-kube-api-access-tnxj2\") pod \"3040da34-0901-40ca-9231-ca7a8dccf838\" (UID: \"3040da34-0901-40ca-9231-ca7a8dccf838\") "
Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.318752 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3040da34-0901-40ca-9231-ca7a8dccf838-utilities\") pod \"3040da34-0901-40ca-9231-ca7a8dccf838\" (UID: \"3040da34-0901-40ca-9231-ca7a8dccf838\") "
Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.319018 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3040da34-0901-40ca-9231-ca7a8dccf838-catalog-content\") pod \"3040da34-0901-40ca-9231-ca7a8dccf838\" (UID: \"3040da34-0901-40ca-9231-ca7a8dccf838\") "
Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.319999 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3040da34-0901-40ca-9231-ca7a8dccf838-utilities" (OuterVolumeSpecName: "utilities") pod "3040da34-0901-40ca-9231-ca7a8dccf838" (UID: "3040da34-0901-40ca-9231-ca7a8dccf838"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.335981 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3040da34-0901-40ca-9231-ca7a8dccf838-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3040da34-0901-40ca-9231-ca7a8dccf838" (UID: "3040da34-0901-40ca-9231-ca7a8dccf838"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.337771 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3040da34-0901-40ca-9231-ca7a8dccf838-kube-api-access-tnxj2" (OuterVolumeSpecName: "kube-api-access-tnxj2") pod "3040da34-0901-40ca-9231-ca7a8dccf838" (UID: "3040da34-0901-40ca-9231-ca7a8dccf838"). InnerVolumeSpecName "kube-api-access-tnxj2".
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.421505 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnxj2\" (UniqueName: \"kubernetes.io/projected/3040da34-0901-40ca-9231-ca7a8dccf838-kube-api-access-tnxj2\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.421700 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3040da34-0901-40ca-9231-ca7a8dccf838-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.421727 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3040da34-0901-40ca-9231-ca7a8dccf838-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.722002 4664 generic.go:334] "Generic (PLEG): container finished" podID="3040da34-0901-40ca-9231-ca7a8dccf838" containerID="f6c9d6e42745f370601c152766c5735fe04cf93436e994a70c020c40618f412b" exitCode=0 Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.722085 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxbkv" Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.722097 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbkv" event={"ID":"3040da34-0901-40ca-9231-ca7a8dccf838","Type":"ContainerDied","Data":"f6c9d6e42745f370601c152766c5735fe04cf93436e994a70c020c40618f412b"} Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.722189 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbkv" event={"ID":"3040da34-0901-40ca-9231-ca7a8dccf838","Type":"ContainerDied","Data":"451226b0e869c73d7ac4398b7bbe583e1825f6968cd4044f6fa0d340ec69bd34"} Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.722219 4664 scope.go:117] "RemoveContainer" containerID="f6c9d6e42745f370601c152766c5735fe04cf93436e994a70c020c40618f412b" Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.755331 4664 scope.go:117] "RemoveContainer" containerID="2dcc55d10aa329f7296a6a45d71b5060d2a58a9e16cc0831a0b900cb1b617032" Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.788194 4664 scope.go:117] "RemoveContainer" containerID="83ec0687a3aaf234bf90791c4e02e42790c2e8b1630b3cbbcd7d3a56c5989707" Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.792734 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbkv"] Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.808463 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbkv"] Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.838263 4664 scope.go:117] "RemoveContainer" containerID="f6c9d6e42745f370601c152766c5735fe04cf93436e994a70c020c40618f412b" Oct 03 08:12:28 crc kubenswrapper[4664]: E1003 08:12:28.842693 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c9d6e42745f370601c152766c5735fe04cf93436e994a70c020c40618f412b\": container with ID starting with f6c9d6e42745f370601c152766c5735fe04cf93436e994a70c020c40618f412b not found: ID does not exist" containerID="f6c9d6e42745f370601c152766c5735fe04cf93436e994a70c020c40618f412b" Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.842736 4664 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c9d6e42745f370601c152766c5735fe04cf93436e994a70c020c40618f412b"} err="failed to get container status \"f6c9d6e42745f370601c152766c5735fe04cf93436e994a70c020c40618f412b\": rpc error: code = NotFound desc = could not find container \"f6c9d6e42745f370601c152766c5735fe04cf93436e994a70c020c40618f412b\": container with ID starting with f6c9d6e42745f370601c152766c5735fe04cf93436e994a70c020c40618f412b not found: ID does not exist" Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.842761 4664 scope.go:117] "RemoveContainer" containerID="2dcc55d10aa329f7296a6a45d71b5060d2a58a9e16cc0831a0b900cb1b617032" Oct 03 08:12:28 crc kubenswrapper[4664]: E1003 08:12:28.846118 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dcc55d10aa329f7296a6a45d71b5060d2a58a9e16cc0831a0b900cb1b617032\": container with ID starting with 2dcc55d10aa329f7296a6a45d71b5060d2a58a9e16cc0831a0b900cb1b617032 not found: ID does not exist" containerID="2dcc55d10aa329f7296a6a45d71b5060d2a58a9e16cc0831a0b900cb1b617032" Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.846161 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dcc55d10aa329f7296a6a45d71b5060d2a58a9e16cc0831a0b900cb1b617032"} err="failed to get container status \"2dcc55d10aa329f7296a6a45d71b5060d2a58a9e16cc0831a0b900cb1b617032\": rpc error: code = NotFound desc = could not find container \"2dcc55d10aa329f7296a6a45d71b5060d2a58a9e16cc0831a0b900cb1b617032\": container with ID starting with 2dcc55d10aa329f7296a6a45d71b5060d2a58a9e16cc0831a0b900cb1b617032 not found: ID does not exist" Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.846185 4664 scope.go:117] "RemoveContainer" containerID="83ec0687a3aaf234bf90791c4e02e42790c2e8b1630b3cbbcd7d3a56c5989707" Oct 03 08:12:28 crc kubenswrapper[4664]: E1003 08:12:28.860856 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83ec0687a3aaf234bf90791c4e02e42790c2e8b1630b3cbbcd7d3a56c5989707\": container with ID starting with 83ec0687a3aaf234bf90791c4e02e42790c2e8b1630b3cbbcd7d3a56c5989707 not found: ID does not exist" containerID="83ec0687a3aaf234bf90791c4e02e42790c2e8b1630b3cbbcd7d3a56c5989707" Oct 03 08:12:28 crc kubenswrapper[4664]: I1003 08:12:28.860931 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83ec0687a3aaf234bf90791c4e02e42790c2e8b1630b3cbbcd7d3a56c5989707"} err="failed to get container status \"83ec0687a3aaf234bf90791c4e02e42790c2e8b1630b3cbbcd7d3a56c5989707\": rpc error: code = NotFound desc = could not find container \"83ec0687a3aaf234bf90791c4e02e42790c2e8b1630b3cbbcd7d3a56c5989707\": container with ID starting with 83ec0687a3aaf234bf90791c4e02e42790c2e8b1630b3cbbcd7d3a56c5989707 not found: ID does not exist" Oct 03 08:12:29 crc kubenswrapper[4664]: I1003 08:12:29.889552 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3040da34-0901-40ca-9231-ca7a8dccf838" path="/var/lib/kubelet/pods/3040da34-0901-40ca-9231-ca7a8dccf838/volumes" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.149026 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vtzfh"] Oct 03 08:12:32 crc kubenswrapper[4664]: E1003 08:12:32.149745 4664 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3040da34-0901-40ca-9231-ca7a8dccf838" containerName="extract-utilities" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.149760 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="3040da34-0901-40ca-9231-ca7a8dccf838" containerName="extract-utilities" Oct 03 08:12:32 crc kubenswrapper[4664]: E1003 08:12:32.149775 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3040da34-0901-40ca-9231-ca7a8dccf838" containerName="registry-server" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.149787 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="3040da34-0901-40ca-9231-ca7a8dccf838" containerName="registry-server" Oct 03 08:12:32 crc kubenswrapper[4664]: E1003 08:12:32.149816 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3040da34-0901-40ca-9231-ca7a8dccf838" containerName="extract-content" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.149825 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="3040da34-0901-40ca-9231-ca7a8dccf838" containerName="extract-content" Oct 03 08:12:32 crc kubenswrapper[4664]: E1003 08:12:32.149839 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3de7938-81c4-44e0-a3e5-d050b3c9dcc4" containerName="registry-server" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.149847 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3de7938-81c4-44e0-a3e5-d050b3c9dcc4" containerName="registry-server" Oct 03 08:12:32 crc kubenswrapper[4664]: E1003 08:12:32.149873 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3de7938-81c4-44e0-a3e5-d050b3c9dcc4" containerName="extract-content" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.149880 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3de7938-81c4-44e0-a3e5-d050b3c9dcc4" containerName="extract-content" Oct 03 08:12:32 crc kubenswrapper[4664]: E1003 08:12:32.149893 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3de7938-81c4-44e0-a3e5-d050b3c9dcc4" containerName="extract-utilities" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.149899 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3de7938-81c4-44e0-a3e5-d050b3c9dcc4" containerName="extract-utilities" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.150140 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3de7938-81c4-44e0-a3e5-d050b3c9dcc4" containerName="registry-server" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.150171 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="3040da34-0901-40ca-9231-ca7a8dccf838" containerName="registry-server" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.151720 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.160069 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vtzfh"] Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.212060 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85cabdd1-8b09-496e-938c-f9ad4f6732cf-utilities\") pod \"redhat-operators-vtzfh\" (UID: \"85cabdd1-8b09-496e-938c-f9ad4f6732cf\") " pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.212183 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjmts\" (UniqueName: \"kubernetes.io/projected/85cabdd1-8b09-496e-938c-f9ad4f6732cf-kube-api-access-qjmts\") pod \"redhat-operators-vtzfh\" (UID: \"85cabdd1-8b09-496e-938c-f9ad4f6732cf\") " pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.212232 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85cabdd1-8b09-496e-938c-f9ad4f6732cf-catalog-content\") pod \"redhat-operators-vtzfh\" (UID: \"85cabdd1-8b09-496e-938c-f9ad4f6732cf\") " pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.314053 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85cabdd1-8b09-496e-938c-f9ad4f6732cf-catalog-content\") pod \"redhat-operators-vtzfh\" (UID: \"85cabdd1-8b09-496e-938c-f9ad4f6732cf\") " pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.314204 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85cabdd1-8b09-496e-938c-f9ad4f6732cf-utilities\") pod \"redhat-operators-vtzfh\" (UID: \"85cabdd1-8b09-496e-938c-f9ad4f6732cf\") " pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.314315 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjmts\" (UniqueName: \"kubernetes.io/projected/85cabdd1-8b09-496e-938c-f9ad4f6732cf-kube-api-access-qjmts\") pod \"redhat-operators-vtzfh\" (UID: \"85cabdd1-8b09-496e-938c-f9ad4f6732cf\") " pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.314582 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85cabdd1-8b09-496e-938c-f9ad4f6732cf-catalog-content\") pod \"redhat-operators-vtzfh\" (UID: \"85cabdd1-8b09-496e-938c-f9ad4f6732cf\") " pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.314852 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85cabdd1-8b09-496e-938c-f9ad4f6732cf-utilities\") pod \"redhat-operators-vtzfh\" (UID: \"85cabdd1-8b09-496e-938c-f9ad4f6732cf\") " pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.336160 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qjmts\" (UniqueName: \"kubernetes.io/projected/85cabdd1-8b09-496e-938c-f9ad4f6732cf-kube-api-access-qjmts\") pod \"redhat-operators-vtzfh\" (UID: \"85cabdd1-8b09-496e-938c-f9ad4f6732cf\") " pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.473881 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:32 crc kubenswrapper[4664]: I1003 08:12:32.980632 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vtzfh"] Oct 03 08:12:32 crc kubenswrapper[4664]: W1003 08:12:32.984124 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85cabdd1_8b09_496e_938c_f9ad4f6732cf.slice/crio-1f35b82e1f83d8e63a88cc59ad44467c6c1381c1d85b2bc67fa1f2fd4cf6a22c WatchSource:0}: Error finding container 1f35b82e1f83d8e63a88cc59ad44467c6c1381c1d85b2bc67fa1f2fd4cf6a22c: Status 404 returned error can't find the container with id 1f35b82e1f83d8e63a88cc59ad44467c6c1381c1d85b2bc67fa1f2fd4cf6a22c Oct 03 08:12:33 crc kubenswrapper[4664]: E1003 08:12:33.460226 4664 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85cabdd1_8b09_496e_938c_f9ad4f6732cf.slice/crio-conmon-f846351eddd35022b8e0da55a32b8eb7936ddb67d41196297e251eeda370d528.scope\": RecentStats: unable to find data in memory cache]" Oct 03 08:12:33 crc kubenswrapper[4664]: I1003 08:12:33.782208 4664 generic.go:334] "Generic (PLEG): container finished" podID="85cabdd1-8b09-496e-938c-f9ad4f6732cf" containerID="f846351eddd35022b8e0da55a32b8eb7936ddb67d41196297e251eeda370d528" exitCode=0 Oct 03 08:12:33 crc kubenswrapper[4664]: I1003 08:12:33.782306 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtzfh" event={"ID":"85cabdd1-8b09-496e-938c-f9ad4f6732cf","Type":"ContainerDied","Data":"f846351eddd35022b8e0da55a32b8eb7936ddb67d41196297e251eeda370d528"} Oct 03 08:12:33 crc kubenswrapper[4664]: I1003 08:12:33.782844 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtzfh" event={"ID":"85cabdd1-8b09-496e-938c-f9ad4f6732cf","Type":"ContainerStarted","Data":"1f35b82e1f83d8e63a88cc59ad44467c6c1381c1d85b2bc67fa1f2fd4cf6a22c"} Oct 03 08:12:34 crc kubenswrapper[4664]: I1003 08:12:34.796308 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtzfh" event={"ID":"85cabdd1-8b09-496e-938c-f9ad4f6732cf","Type":"ContainerStarted","Data":"9d88540927335426d27144de403563f1b4d5c575f1decff04e78361885bf861f"} Oct 03 08:12:35 crc kubenswrapper[4664]: I1003 08:12:35.809708 4664 generic.go:334] "Generic (PLEG): container finished" podID="85cabdd1-8b09-496e-938c-f9ad4f6732cf" containerID="9d88540927335426d27144de403563f1b4d5c575f1decff04e78361885bf861f" exitCode=0 Oct 03 08:12:35 crc kubenswrapper[4664]: I1003 08:12:35.809781 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtzfh" event={"ID":"85cabdd1-8b09-496e-938c-f9ad4f6732cf","Type":"ContainerDied","Data":"9d88540927335426d27144de403563f1b4d5c575f1decff04e78361885bf861f"} Oct 03 08:12:35 crc kubenswrapper[4664]: I1003 08:12:35.978296 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:12:36 crc 
kubenswrapper[4664]: I1003 08:12:36.824519 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtzfh" event={"ID":"85cabdd1-8b09-496e-938c-f9ad4f6732cf","Type":"ContainerStarted","Data":"505e904c006f9bff26e121e95dfeb0515600450b82f3d87d5bfe3f7ded7529fc"} Oct 03 08:12:36 crc kubenswrapper[4664]: I1003 08:12:36.865304 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vtzfh" podStartSLOduration=2.427448753 podStartE2EDuration="4.865286726s" podCreationTimestamp="2025-10-03 08:12:32 +0000 UTC" firstStartedPulling="2025-10-03 08:12:33.78574421 +0000 UTC m=+1454.606934700" lastFinishedPulling="2025-10-03 08:12:36.223582183 +0000 UTC m=+1457.044772673" observedRunningTime="2025-10-03 08:12:36.863269015 +0000 UTC m=+1457.684459525" watchObservedRunningTime="2025-10-03 08:12:36.865286726 +0000 UTC m=+1457.686477216" Oct 03 08:12:36 crc kubenswrapper[4664]: I1003 08:12:36.927319 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:12:41 crc kubenswrapper[4664]: I1003 08:12:41.497487 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="8bb7b62d-f030-45a7-b9f8-87852ea275de" containerName="rabbitmq" containerID="cri-o://d549b761d0aa07a6d12f415fc19e3a85b5e532acd6ca6a8b918c4583dae2b9fe" gracePeriod=604795 Oct 03 08:12:41 crc kubenswrapper[4664]: I1003 08:12:41.987843 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:12:41 crc kubenswrapper[4664]: I1003 08:12:41.988281 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:12:42 crc kubenswrapper[4664]: I1003 08:12:42.432439 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="b8ae1def-1d1a-4acd-af78-204219a99fe6" containerName="rabbitmq" containerID="cri-o://99a69382c766c5f7bd613bbec49e828ed57ab948713f9652a5673c0488212a4d" gracePeriod=604795 Oct 03 08:12:42 crc kubenswrapper[4664]: I1003 08:12:42.475176 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:42 crc kubenswrapper[4664]: I1003 08:12:42.475275 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:42 crc kubenswrapper[4664]: I1003 08:12:42.528350 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:42 crc kubenswrapper[4664]: I1003 08:12:42.954140 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:43 crc kubenswrapper[4664]: I1003 08:12:43.024912 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vtzfh"] Oct 03 08:12:44 crc kubenswrapper[4664]: I1003 08:12:44.925748 4664 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vtzfh" podUID="85cabdd1-8b09-496e-938c-f9ad4f6732cf" containerName="registry-server" containerID="cri-o://505e904c006f9bff26e121e95dfeb0515600450b82f3d87d5bfe3f7ded7529fc" gracePeriod=2 Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.420382 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.569357 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjmts\" (UniqueName: \"kubernetes.io/projected/85cabdd1-8b09-496e-938c-f9ad4f6732cf-kube-api-access-qjmts\") pod \"85cabdd1-8b09-496e-938c-f9ad4f6732cf\" (UID: \"85cabdd1-8b09-496e-938c-f9ad4f6732cf\") " Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.569599 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85cabdd1-8b09-496e-938c-f9ad4f6732cf-catalog-content\") pod \"85cabdd1-8b09-496e-938c-f9ad4f6732cf\" (UID: \"85cabdd1-8b09-496e-938c-f9ad4f6732cf\") " Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.569647 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85cabdd1-8b09-496e-938c-f9ad4f6732cf-utilities\") pod \"85cabdd1-8b09-496e-938c-f9ad4f6732cf\" (UID: \"85cabdd1-8b09-496e-938c-f9ad4f6732cf\") " Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.571151 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85cabdd1-8b09-496e-938c-f9ad4f6732cf-utilities" (OuterVolumeSpecName: "utilities") pod "85cabdd1-8b09-496e-938c-f9ad4f6732cf" (UID: "85cabdd1-8b09-496e-938c-f9ad4f6732cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.579833 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85cabdd1-8b09-496e-938c-f9ad4f6732cf-kube-api-access-qjmts" (OuterVolumeSpecName: "kube-api-access-qjmts") pod "85cabdd1-8b09-496e-938c-f9ad4f6732cf" (UID: "85cabdd1-8b09-496e-938c-f9ad4f6732cf"). InnerVolumeSpecName "kube-api-access-qjmts". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.673071 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85cabdd1-8b09-496e-938c-f9ad4f6732cf-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.673119 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjmts\" (UniqueName: \"kubernetes.io/projected/85cabdd1-8b09-496e-938c-f9ad4f6732cf-kube-api-access-qjmts\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.674361 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85cabdd1-8b09-496e-938c-f9ad4f6732cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85cabdd1-8b09-496e-938c-f9ad4f6732cf" (UID: "85cabdd1-8b09-496e-938c-f9ad4f6732cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.776808 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85cabdd1-8b09-496e-938c-f9ad4f6732cf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.940289 4664 generic.go:334] "Generic (PLEG): container finished" podID="85cabdd1-8b09-496e-938c-f9ad4f6732cf" containerID="505e904c006f9bff26e121e95dfeb0515600450b82f3d87d5bfe3f7ded7529fc" exitCode=0 Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.940349 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtzfh" event={"ID":"85cabdd1-8b09-496e-938c-f9ad4f6732cf","Type":"ContainerDied","Data":"505e904c006f9bff26e121e95dfeb0515600450b82f3d87d5bfe3f7ded7529fc"} Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.940415 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtzfh" event={"ID":"85cabdd1-8b09-496e-938c-f9ad4f6732cf","Type":"ContainerDied","Data":"1f35b82e1f83d8e63a88cc59ad44467c6c1381c1d85b2bc67fa1f2fd4cf6a22c"} Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.940447 4664 scope.go:117] "RemoveContainer" containerID="505e904c006f9bff26e121e95dfeb0515600450b82f3d87d5bfe3f7ded7529fc" Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.940917 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vtzfh" Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.968680 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vtzfh"] Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.976342 4664 scope.go:117] "RemoveContainer" containerID="9d88540927335426d27144de403563f1b4d5c575f1decff04e78361885bf861f" Oct 03 08:12:45 crc kubenswrapper[4664]: I1003 08:12:45.983053 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vtzfh"] Oct 03 08:12:46 crc kubenswrapper[4664]: I1003 08:12:46.003228 4664 scope.go:117] "RemoveContainer" containerID="f846351eddd35022b8e0da55a32b8eb7936ddb67d41196297e251eeda370d528" Oct 03 08:12:46 crc kubenswrapper[4664]: I1003 08:12:46.053491 4664 scope.go:117] "RemoveContainer" containerID="505e904c006f9bff26e121e95dfeb0515600450b82f3d87d5bfe3f7ded7529fc" Oct 03 08:12:46 crc kubenswrapper[4664]: E1003 08:12:46.054267 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"505e904c006f9bff26e121e95dfeb0515600450b82f3d87d5bfe3f7ded7529fc\": container with ID starting with 505e904c006f9bff26e121e95dfeb0515600450b82f3d87d5bfe3f7ded7529fc not found: ID does not exist" containerID="505e904c006f9bff26e121e95dfeb0515600450b82f3d87d5bfe3f7ded7529fc" Oct 03 08:12:46 crc kubenswrapper[4664]: I1003 08:12:46.054305 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505e904c006f9bff26e121e95dfeb0515600450b82f3d87d5bfe3f7ded7529fc"} err="failed to get container status \"505e904c006f9bff26e121e95dfeb0515600450b82f3d87d5bfe3f7ded7529fc\": rpc error: code = NotFound desc = could not find container \"505e904c006f9bff26e121e95dfeb0515600450b82f3d87d5bfe3f7ded7529fc\": container with ID starting with 505e904c006f9bff26e121e95dfeb0515600450b82f3d87d5bfe3f7ded7529fc not found: ID does not exist" Oct 03 08:12:46 crc 
kubenswrapper[4664]: I1003 08:12:46.054334 4664 scope.go:117] "RemoveContainer" containerID="9d88540927335426d27144de403563f1b4d5c575f1decff04e78361885bf861f" Oct 03 08:12:46 crc kubenswrapper[4664]: E1003 08:12:46.054754 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d88540927335426d27144de403563f1b4d5c575f1decff04e78361885bf861f\": container with ID starting with 9d88540927335426d27144de403563f1b4d5c575f1decff04e78361885bf861f not found: ID does not exist" containerID="9d88540927335426d27144de403563f1b4d5c575f1decff04e78361885bf861f" Oct 03 08:12:46 crc kubenswrapper[4664]: I1003 08:12:46.054792 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d88540927335426d27144de403563f1b4d5c575f1decff04e78361885bf861f"} err="failed to get container status \"9d88540927335426d27144de403563f1b4d5c575f1decff04e78361885bf861f\": rpc error: code = NotFound desc = could not find container \"9d88540927335426d27144de403563f1b4d5c575f1decff04e78361885bf861f\": container with ID starting with 9d88540927335426d27144de403563f1b4d5c575f1decff04e78361885bf861f not found: ID does not exist" Oct 03 08:12:46 crc kubenswrapper[4664]: I1003 08:12:46.054808 4664 scope.go:117] "RemoveContainer" containerID="f846351eddd35022b8e0da55a32b8eb7936ddb67d41196297e251eeda370d528" Oct 03 08:12:46 crc kubenswrapper[4664]: E1003 08:12:46.055106 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f846351eddd35022b8e0da55a32b8eb7936ddb67d41196297e251eeda370d528\": container with ID starting with f846351eddd35022b8e0da55a32b8eb7936ddb67d41196297e251eeda370d528 not found: ID does not exist" containerID="f846351eddd35022b8e0da55a32b8eb7936ddb67d41196297e251eeda370d528" Oct 03 08:12:46 crc kubenswrapper[4664]: I1003 08:12:46.055134 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f846351eddd35022b8e0da55a32b8eb7936ddb67d41196297e251eeda370d528"} err="failed to get container status \"f846351eddd35022b8e0da55a32b8eb7936ddb67d41196297e251eeda370d528\": rpc error: code = NotFound desc = could not find container \"f846351eddd35022b8e0da55a32b8eb7936ddb67d41196297e251eeda370d528\": container with ID starting with f846351eddd35022b8e0da55a32b8eb7936ddb67d41196297e251eeda370d528 not found: ID does not exist" Oct 03 08:12:47 crc kubenswrapper[4664]: I1003 08:12:47.520897 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="8bb7b62d-f030-45a7-b9f8-87852ea275de" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.96:5671: connect: connection refused" Oct 03 08:12:47 crc kubenswrapper[4664]: I1003 08:12:47.801857 4664 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="b8ae1def-1d1a-4acd-af78-204219a99fe6" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.97:5671: connect: connection refused" Oct 03 08:12:47 crc kubenswrapper[4664]: I1003 08:12:47.892737 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85cabdd1-8b09-496e-938c-f9ad4f6732cf" path="/var/lib/kubelet/pods/85cabdd1-8b09-496e-938c-f9ad4f6732cf/volumes" Oct 03 08:12:47 crc kubenswrapper[4664]: I1003 08:12:47.981009 4664 generic.go:334] "Generic (PLEG): container finished" podID="8bb7b62d-f030-45a7-b9f8-87852ea275de" 
containerID="d549b761d0aa07a6d12f415fc19e3a85b5e532acd6ca6a8b918c4583dae2b9fe" exitCode=0 Oct 03 08:12:47 crc kubenswrapper[4664]: I1003 08:12:47.981071 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8bb7b62d-f030-45a7-b9f8-87852ea275de","Type":"ContainerDied","Data":"d549b761d0aa07a6d12f415fc19e3a85b5e532acd6ca6a8b918c4583dae2b9fe"} Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.144394 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.240370 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-server-conf\") pod \"8bb7b62d-f030-45a7-b9f8-87852ea275de\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.240882 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-confd\") pod \"8bb7b62d-f030-45a7-b9f8-87852ea275de\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.240918 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"8bb7b62d-f030-45a7-b9f8-87852ea275de\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.240985 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8bb7b62d-f030-45a7-b9f8-87852ea275de-pod-info\") pod \"8bb7b62d-f030-45a7-b9f8-87852ea275de\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.241061 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-tls\") pod \"8bb7b62d-f030-45a7-b9f8-87852ea275de\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.241114 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-erlang-cookie\") pod \"8bb7b62d-f030-45a7-b9f8-87852ea275de\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.241227 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-plugins-conf\") pod \"8bb7b62d-f030-45a7-b9f8-87852ea275de\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.241286 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8bb7b62d-f030-45a7-b9f8-87852ea275de-erlang-cookie-secret\") pod \"8bb7b62d-f030-45a7-b9f8-87852ea275de\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.241465 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-plugins\") pod \"8bb7b62d-f030-45a7-b9f8-87852ea275de\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.241506 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-config-data\") pod \"8bb7b62d-f030-45a7-b9f8-87852ea275de\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.241549 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdgwf\" (UniqueName: \"kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-kube-api-access-wdgwf\") pod \"8bb7b62d-f030-45a7-b9f8-87852ea275de\" (UID: \"8bb7b62d-f030-45a7-b9f8-87852ea275de\") " Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.242269 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8bb7b62d-f030-45a7-b9f8-87852ea275de" (UID: "8bb7b62d-f030-45a7-b9f8-87852ea275de"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.250188 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8bb7b62d-f030-45a7-b9f8-87852ea275de" (UID: "8bb7b62d-f030-45a7-b9f8-87852ea275de"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.250358 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8bb7b62d-f030-45a7-b9f8-87852ea275de" (UID: "8bb7b62d-f030-45a7-b9f8-87852ea275de"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.252592 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8bb7b62d-f030-45a7-b9f8-87852ea275de-pod-info" (OuterVolumeSpecName: "pod-info") pod "8bb7b62d-f030-45a7-b9f8-87852ea275de" (UID: "8bb7b62d-f030-45a7-b9f8-87852ea275de"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.265805 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-kube-api-access-wdgwf" (OuterVolumeSpecName: "kube-api-access-wdgwf") pod "8bb7b62d-f030-45a7-b9f8-87852ea275de" (UID: "8bb7b62d-f030-45a7-b9f8-87852ea275de"). InnerVolumeSpecName "kube-api-access-wdgwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.266918 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb7b62d-f030-45a7-b9f8-87852ea275de-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8bb7b62d-f030-45a7-b9f8-87852ea275de" (UID: "8bb7b62d-f030-45a7-b9f8-87852ea275de"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.267238 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8bb7b62d-f030-45a7-b9f8-87852ea275de" (UID: "8bb7b62d-f030-45a7-b9f8-87852ea275de"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.323760 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "8bb7b62d-f030-45a7-b9f8-87852ea275de" (UID: "8bb7b62d-f030-45a7-b9f8-87852ea275de"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.345898 4664 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.349676 4664 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8bb7b62d-f030-45a7-b9f8-87852ea275de-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.349806 4664 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.349863 4664 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.349916 4664 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.349967 4664 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8bb7b62d-f030-45a7-b9f8-87852ea275de-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.350014 4664 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.350084 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdgwf\" (UniqueName: \"kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-kube-api-access-wdgwf\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.406462 4664 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.412179 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-config-data" (OuterVolumeSpecName: "config-data") pod 
"8bb7b62d-f030-45a7-b9f8-87852ea275de" (UID: "8bb7b62d-f030-45a7-b9f8-87852ea275de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.434211 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-server-conf" (OuterVolumeSpecName: "server-conf") pod "8bb7b62d-f030-45a7-b9f8-87852ea275de" (UID: "8bb7b62d-f030-45a7-b9f8-87852ea275de"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.452374 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.452674 4664 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8bb7b62d-f030-45a7-b9f8-87852ea275de-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.452791 4664 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.508839 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8bb7b62d-f030-45a7-b9f8-87852ea275de" (UID: "8bb7b62d-f030-45a7-b9f8-87852ea275de"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.554693 4664 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8bb7b62d-f030-45a7-b9f8-87852ea275de-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.993271 4664 generic.go:334] "Generic (PLEG): container finished" podID="b8ae1def-1d1a-4acd-af78-204219a99fe6" containerID="99a69382c766c5f7bd613bbec49e828ed57ab948713f9652a5673c0488212a4d" exitCode=0 Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.993376 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8ae1def-1d1a-4acd-af78-204219a99fe6","Type":"ContainerDied","Data":"99a69382c766c5f7bd613bbec49e828ed57ab948713f9652a5673c0488212a4d"} Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.997441 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8bb7b62d-f030-45a7-b9f8-87852ea275de","Type":"ContainerDied","Data":"56fe3d12375be207cde5b8e108243abbb55d19cce00dcf9b4720c4d39ddd2f81"} Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.997537 4664 scope.go:117] "RemoveContainer" containerID="d549b761d0aa07a6d12f415fc19e3a85b5e532acd6ca6a8b918c4583dae2b9fe" Oct 03 08:12:48 crc kubenswrapper[4664]: I1003 08:12:48.997545 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.100650 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.113180 4664 scope.go:117] "RemoveContainer" containerID="dea969824e6c260d6adfd6cb873a9ed48c1243cced9fbb9c84161d22c5a1daa9" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.121472 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.135619 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.162355 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:12:49 crc kubenswrapper[4664]: E1003 08:12:49.162959 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ae1def-1d1a-4acd-af78-204219a99fe6" containerName="rabbitmq" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.162987 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ae1def-1d1a-4acd-af78-204219a99fe6" containerName="rabbitmq" Oct 03 08:12:49 crc kubenswrapper[4664]: E1003 08:12:49.163004 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb7b62d-f030-45a7-b9f8-87852ea275de" containerName="setup-container" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.163014 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb7b62d-f030-45a7-b9f8-87852ea275de" containerName="setup-container" Oct 03 08:12:49 crc kubenswrapper[4664]: E1003 08:12:49.163040 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ae1def-1d1a-4acd-af78-204219a99fe6" containerName="setup-container" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.163049 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ae1def-1d1a-4acd-af78-204219a99fe6" containerName="setup-container" Oct 03 08:12:49 crc kubenswrapper[4664]: E1003 08:12:49.163432 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85cabdd1-8b09-496e-938c-f9ad4f6732cf" containerName="extract-utilities" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.163453 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="85cabdd1-8b09-496e-938c-f9ad4f6732cf" containerName="extract-utilities" Oct 03 08:12:49 crc kubenswrapper[4664]: E1003 08:12:49.163476 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85cabdd1-8b09-496e-938c-f9ad4f6732cf" containerName="registry-server" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.163484 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="85cabdd1-8b09-496e-938c-f9ad4f6732cf" containerName="registry-server" Oct 03 08:12:49 crc kubenswrapper[4664]: E1003 08:12:49.163505 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85cabdd1-8b09-496e-938c-f9ad4f6732cf" containerName="extract-content" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.163512 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="85cabdd1-8b09-496e-938c-f9ad4f6732cf" containerName="extract-content" Oct 03 08:12:49 crc kubenswrapper[4664]: E1003 08:12:49.163522 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb7b62d-f030-45a7-b9f8-87852ea275de" containerName="rabbitmq" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.163528 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb7b62d-f030-45a7-b9f8-87852ea275de" containerName="rabbitmq" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.163828 4664 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b8ae1def-1d1a-4acd-af78-204219a99fe6" containerName="rabbitmq" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.163858 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb7b62d-f030-45a7-b9f8-87852ea275de" containerName="rabbitmq" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.163884 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="85cabdd1-8b09-496e-938c-f9ad4f6732cf" containerName="registry-server" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.165405 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.168002 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.168274 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.169320 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.169573 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.169800 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.169947 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.170164 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dndjq" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.250052 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.268446 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-server-conf\") pod \"b8ae1def-1d1a-4acd-af78-204219a99fe6\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.268528 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-config-data\") pod \"b8ae1def-1d1a-4acd-af78-204219a99fe6\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.268563 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8ae1def-1d1a-4acd-af78-204219a99fe6-pod-info\") pod \"b8ae1def-1d1a-4acd-af78-204219a99fe6\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.268639 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-plugins-conf\") pod \"b8ae1def-1d1a-4acd-af78-204219a99fe6\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.268709 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-erlang-cookie\") pod \"b8ae1def-1d1a-4acd-af78-204219a99fe6\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.268757 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-confd\") pod \"b8ae1def-1d1a-4acd-af78-204219a99fe6\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.268782 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4rm9\" (UniqueName: \"kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-kube-api-access-g4rm9\") pod \"b8ae1def-1d1a-4acd-af78-204219a99fe6\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.268820 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8ae1def-1d1a-4acd-af78-204219a99fe6-erlang-cookie-secret\") pod \"b8ae1def-1d1a-4acd-af78-204219a99fe6\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.268860 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-plugins\") pod \"b8ae1def-1d1a-4acd-af78-204219a99fe6\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.268891 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"b8ae1def-1d1a-4acd-af78-204219a99fe6\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.268980 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-tls\") pod \"b8ae1def-1d1a-4acd-af78-204219a99fe6\" (UID: \"b8ae1def-1d1a-4acd-af78-204219a99fe6\") " Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.269584 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d2ecc09-bcf3-4702-9456-e6c6880256cb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.269674 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d2ecc09-bcf3-4702-9456-e6c6880256cb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.269709 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d2ecc09-bcf3-4702-9456-e6c6880256cb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 
08:12:49.269766 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d2ecc09-bcf3-4702-9456-e6c6880256cb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.269801 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.269825 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b8ae1def-1d1a-4acd-af78-204219a99fe6" (UID: "b8ae1def-1d1a-4acd-af78-204219a99fe6"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.269849 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d2ecc09-bcf3-4702-9456-e6c6880256cb-config-data\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.269866 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d2ecc09-bcf3-4702-9456-e6c6880256cb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.269930 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d2ecc09-bcf3-4702-9456-e6c6880256cb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.269957 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d2ecc09-bcf3-4702-9456-e6c6880256cb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.269997 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d2ecc09-bcf3-4702-9456-e6c6880256cb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.270060 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn4kx\" (UniqueName: \"kubernetes.io/projected/2d2ecc09-bcf3-4702-9456-e6c6880256cb-kube-api-access-dn4kx\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.270136 4664 reconciler_common.go:293] 
"Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.270528 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b8ae1def-1d1a-4acd-af78-204219a99fe6" (UID: "b8ae1def-1d1a-4acd-af78-204219a99fe6"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.271583 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b8ae1def-1d1a-4acd-af78-204219a99fe6" (UID: "b8ae1def-1d1a-4acd-af78-204219a99fe6"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.275329 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "b8ae1def-1d1a-4acd-af78-204219a99fe6" (UID: "b8ae1def-1d1a-4acd-af78-204219a99fe6"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.283415 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b8ae1def-1d1a-4acd-af78-204219a99fe6-pod-info" (OuterVolumeSpecName: "pod-info") pod "b8ae1def-1d1a-4acd-af78-204219a99fe6" (UID: "b8ae1def-1d1a-4acd-af78-204219a99fe6"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.283629 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b8ae1def-1d1a-4acd-af78-204219a99fe6" (UID: "b8ae1def-1d1a-4acd-af78-204219a99fe6"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.286178 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-kube-api-access-g4rm9" (OuterVolumeSpecName: "kube-api-access-g4rm9") pod "b8ae1def-1d1a-4acd-af78-204219a99fe6" (UID: "b8ae1def-1d1a-4acd-af78-204219a99fe6"). InnerVolumeSpecName "kube-api-access-g4rm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.288479 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ae1def-1d1a-4acd-af78-204219a99fe6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b8ae1def-1d1a-4acd-af78-204219a99fe6" (UID: "b8ae1def-1d1a-4acd-af78-204219a99fe6"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.310529 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-config-data" (OuterVolumeSpecName: "config-data") pod "b8ae1def-1d1a-4acd-af78-204219a99fe6" (UID: "b8ae1def-1d1a-4acd-af78-204219a99fe6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.344241 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-server-conf" (OuterVolumeSpecName: "server-conf") pod "b8ae1def-1d1a-4acd-af78-204219a99fe6" (UID: "b8ae1def-1d1a-4acd-af78-204219a99fe6"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.372925 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn4kx\" (UniqueName: \"kubernetes.io/projected/2d2ecc09-bcf3-4702-9456-e6c6880256cb-kube-api-access-dn4kx\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.373034 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d2ecc09-bcf3-4702-9456-e6c6880256cb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.373079 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d2ecc09-bcf3-4702-9456-e6c6880256cb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.373125 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d2ecc09-bcf3-4702-9456-e6c6880256cb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.373174 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d2ecc09-bcf3-4702-9456-e6c6880256cb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.373208 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.373248 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d2ecc09-bcf3-4702-9456-e6c6880256cb-config-data\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.373273 4664 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d2ecc09-bcf3-4702-9456-e6c6880256cb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.373326 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d2ecc09-bcf3-4702-9456-e6c6880256cb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.373354 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d2ecc09-bcf3-4702-9456-e6c6880256cb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.373390 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d2ecc09-bcf3-4702-9456-e6c6880256cb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.373468 4664 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.373482 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8ae1def-1d1a-4acd-af78-204219a99fe6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.373495 4664 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8ae1def-1d1a-4acd-af78-204219a99fe6-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.374262 4664 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.374286 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4rm9\" (UniqueName: \"kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-kube-api-access-g4rm9\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.374309 4664 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8ae1def-1d1a-4acd-af78-204219a99fe6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.374321 4664 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.374354 4664 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 
03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.374367 4664 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.378459 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.382304 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d2ecc09-bcf3-4702-9456-e6c6880256cb-config-data\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.383247 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d2ecc09-bcf3-4702-9456-e6c6880256cb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.386329 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d2ecc09-bcf3-4702-9456-e6c6880256cb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.386661 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d2ecc09-bcf3-4702-9456-e6c6880256cb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.386770 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d2ecc09-bcf3-4702-9456-e6c6880256cb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.389206 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d2ecc09-bcf3-4702-9456-e6c6880256cb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.390112 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d2ecc09-bcf3-4702-9456-e6c6880256cb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.390324 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d2ecc09-bcf3-4702-9456-e6c6880256cb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc 
kubenswrapper[4664]: I1003 08:12:49.391034 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d2ecc09-bcf3-4702-9456-e6c6880256cb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.411796 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn4kx\" (UniqueName: \"kubernetes.io/projected/2d2ecc09-bcf3-4702-9456-e6c6880256cb-kube-api-access-dn4kx\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.427491 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"2d2ecc09-bcf3-4702-9456-e6c6880256cb\") " pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.438739 4664 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.450828 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b8ae1def-1d1a-4acd-af78-204219a99fe6" (UID: "b8ae1def-1d1a-4acd-af78-204219a99fe6"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.476530 4664 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8ae1def-1d1a-4acd-af78-204219a99fe6-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.476582 4664 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.503069 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 08:12:49 crc kubenswrapper[4664]: I1003 08:12:49.890582 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb7b62d-f030-45a7-b9f8-87852ea275de" path="/var/lib/kubelet/pods/8bb7b62d-f030-45a7-b9f8-87852ea275de/volumes" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.010118 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8ae1def-1d1a-4acd-af78-204219a99fe6","Type":"ContainerDied","Data":"89fafbe8bc674ee21df5e8484a48004dd14c2fdd355ca43e1401ccde72f7e7bd"} Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.010175 4664 scope.go:117] "RemoveContainer" containerID="99a69382c766c5f7bd613bbec49e828ed57ab948713f9652a5673c0488212a4d" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.010334 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.036413 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.053257 4664 scope.go:117] "RemoveContainer" containerID="62f158f76865a1bd74283911d566ec6f8f9e54cea9ffdb4b21088f5420dd8544" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.056027 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.079691 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.095440 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.097714 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.101357 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.101578 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.101705 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.109144 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.115522 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.115881 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.115990 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.116104 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-fv2fr" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.203189 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/309f57eb-9191-4466-a9eb-4beac6f647ae-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.203240 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.203274 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/309f57eb-9191-4466-a9eb-4beac6f647ae-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.203295 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/309f57eb-9191-4466-a9eb-4beac6f647ae-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.203317 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/309f57eb-9191-4466-a9eb-4beac6f647ae-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.203358 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/309f57eb-9191-4466-a9eb-4beac6f647ae-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.203385 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/309f57eb-9191-4466-a9eb-4beac6f647ae-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.203411 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hb7q\" (UniqueName: \"kubernetes.io/projected/309f57eb-9191-4466-a9eb-4beac6f647ae-kube-api-access-8hb7q\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.203451 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/309f57eb-9191-4466-a9eb-4beac6f647ae-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.203468 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/309f57eb-9191-4466-a9eb-4beac6f647ae-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.203494 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/309f57eb-9191-4466-a9eb-4beac6f647ae-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.305589 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/309f57eb-9191-4466-a9eb-4beac6f647ae-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.305674 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.305719 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/309f57eb-9191-4466-a9eb-4beac6f647ae-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.305752 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/309f57eb-9191-4466-a9eb-4beac6f647ae-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.305785 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/309f57eb-9191-4466-a9eb-4beac6f647ae-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.305839 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/309f57eb-9191-4466-a9eb-4beac6f647ae-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.305871 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/309f57eb-9191-4466-a9eb-4beac6f647ae-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.305905 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hb7q\" (UniqueName: \"kubernetes.io/projected/309f57eb-9191-4466-a9eb-4beac6f647ae-kube-api-access-8hb7q\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.305952 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/309f57eb-9191-4466-a9eb-4beac6f647ae-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.305975 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/309f57eb-9191-4466-a9eb-4beac6f647ae-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc 
kubenswrapper[4664]: I1003 08:12:50.306008 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/309f57eb-9191-4466-a9eb-4beac6f647ae-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.306165 4664 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.306956 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/309f57eb-9191-4466-a9eb-4beac6f647ae-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.307635 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/309f57eb-9191-4466-a9eb-4beac6f647ae-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.307848 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/309f57eb-9191-4466-a9eb-4beac6f647ae-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.308280 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/309f57eb-9191-4466-a9eb-4beac6f647ae-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.308534 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/309f57eb-9191-4466-a9eb-4beac6f647ae-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.311741 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/309f57eb-9191-4466-a9eb-4beac6f647ae-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.312166 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/309f57eb-9191-4466-a9eb-4beac6f647ae-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.312272 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/309f57eb-9191-4466-a9eb-4beac6f647ae-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.316049 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/309f57eb-9191-4466-a9eb-4beac6f647ae-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.328384 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hb7q\" (UniqueName: \"kubernetes.io/projected/309f57eb-9191-4466-a9eb-4beac6f647ae-kube-api-access-8hb7q\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.345303 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"309f57eb-9191-4466-a9eb-4beac6f647ae\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:50 crc kubenswrapper[4664]: I1003 08:12:50.647296 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.026855 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2d2ecc09-bcf3-4702-9456-e6c6880256cb","Type":"ContainerStarted","Data":"d31e9acad4c265c2c57448999472bddad469318f4aa3ef4616c0e355fbbd5f8a"} Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.076274 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-cs9h5"] Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.078569 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.093281 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-cs9h5"] Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.099933 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.203027 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.230798 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-config\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.230895 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-dns-svc\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.230943 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.230975 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.231021 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.231060 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.231088 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrt77\" (UniqueName: \"kubernetes.io/projected/2e3132b2-1be9-42db-8653-927679a439f2-kube-api-access-rrt77\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.334243 4664 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.334382 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.334455 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrt77\" (UniqueName: \"kubernetes.io/projected/2e3132b2-1be9-42db-8653-927679a439f2-kube-api-access-rrt77\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.334547 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-config\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.334653 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-dns-svc\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.334719 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.334777 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.335669 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.335965 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.336306 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.336971 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-dns-svc\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.337127 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-cs9h5"] Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.337334 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: E1003 08:12:51.337554 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config kube-api-access-rrt77 ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" podUID="2e3132b2-1be9-42db-8653-927679a439f2" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.337620 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-config\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.386526 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrt77\" (UniqueName: \"kubernetes.io/projected/2e3132b2-1be9-42db-8653-927679a439f2-kube-api-access-rrt77\") pod \"dnsmasq-dns-5576978c7c-cs9h5\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.400117 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-t8fh9"] Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.401993 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.441809 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-t8fh9"] Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.538129 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g89j\" (UniqueName: \"kubernetes.io/projected/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-kube-api-access-4g89j\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.538511 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.538578 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.538622 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.538671 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-config\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.538752 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.538779 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.640901 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g89j\" (UniqueName: \"kubernetes.io/projected/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-kube-api-access-4g89j\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.640957 4664 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.640999 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.641018 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.641048 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-config\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.641093 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.641109 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.642042 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.642859 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.643415 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.643924 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.644595 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.644628 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-config\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.663676 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g89j\" (UniqueName: \"kubernetes.io/projected/f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a-kube-api-access-4g89j\") pod \"dnsmasq-dns-8c6f6df99-t8fh9\" (UID: \"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a\") " pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.823316 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:51 crc kubenswrapper[4664]: I1003 08:12:51.888878 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ae1def-1d1a-4acd-af78-204219a99fe6" path="/var/lib/kubelet/pods/b8ae1def-1d1a-4acd-af78-204219a99fe6/volumes" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.050525 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.050542 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"309f57eb-9191-4466-a9eb-4beac6f647ae","Type":"ContainerStarted","Data":"d764549f92045ce7a99fe2f4add09ff9861e6c138cf43e91e6cd3f9bcc9e31cf"} Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.067302 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.153374 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-openstack-edpm-ipam\") pod \"2e3132b2-1be9-42db-8653-927679a439f2\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.153454 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-ovsdbserver-sb\") pod \"2e3132b2-1be9-42db-8653-927679a439f2\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.153522 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-dns-svc\") pod \"2e3132b2-1be9-42db-8653-927679a439f2\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.153574 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-ovsdbserver-nb\") pod \"2e3132b2-1be9-42db-8653-927679a439f2\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.153659 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-dns-swift-storage-0\") pod \"2e3132b2-1be9-42db-8653-927679a439f2\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.153705 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrt77\" (UniqueName: \"kubernetes.io/projected/2e3132b2-1be9-42db-8653-927679a439f2-kube-api-access-rrt77\") pod \"2e3132b2-1be9-42db-8653-927679a439f2\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.153788 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-config\") pod \"2e3132b2-1be9-42db-8653-927679a439f2\" (UID: \"2e3132b2-1be9-42db-8653-927679a439f2\") " Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.154372 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2e3132b2-1be9-42db-8653-927679a439f2" (UID: "2e3132b2-1be9-42db-8653-927679a439f2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.154380 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e3132b2-1be9-42db-8653-927679a439f2" (UID: "2e3132b2-1be9-42db-8653-927679a439f2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.154421 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "2e3132b2-1be9-42db-8653-927679a439f2" (UID: "2e3132b2-1be9-42db-8653-927679a439f2"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.154884 4664 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.154915 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.154928 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.154888 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-config" (OuterVolumeSpecName: "config") pod "2e3132b2-1be9-42db-8653-927679a439f2" (UID: "2e3132b2-1be9-42db-8653-927679a439f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.155301 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2e3132b2-1be9-42db-8653-927679a439f2" (UID: "2e3132b2-1be9-42db-8653-927679a439f2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.156102 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2e3132b2-1be9-42db-8653-927679a439f2" (UID: "2e3132b2-1be9-42db-8653-927679a439f2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.160098 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e3132b2-1be9-42db-8653-927679a439f2-kube-api-access-rrt77" (OuterVolumeSpecName: "kube-api-access-rrt77") pod "2e3132b2-1be9-42db-8653-927679a439f2" (UID: "2e3132b2-1be9-42db-8653-927679a439f2"). InnerVolumeSpecName "kube-api-access-rrt77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.199836 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-t8fh9"] Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.258009 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrt77\" (UniqueName: \"kubernetes.io/projected/2e3132b2-1be9-42db-8653-927679a439f2-kube-api-access-rrt77\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.258047 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.258057 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:52 crc kubenswrapper[4664]: I1003 08:12:52.258069 4664 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e3132b2-1be9-42db-8653-927679a439f2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:12:53 crc kubenswrapper[4664]: I1003 08:12:53.061036 4664 generic.go:334] "Generic (PLEG): container finished" podID="f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a" containerID="3be8f19b51783734e9f44e3e71f9fc5fa9c94d1f0558dd30142803f3355d358a" exitCode=0 Oct 03 08:12:53 crc kubenswrapper[4664]: I1003 08:12:53.061096 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" event={"ID":"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a","Type":"ContainerDied","Data":"3be8f19b51783734e9f44e3e71f9fc5fa9c94d1f0558dd30142803f3355d358a"} Oct 03 08:12:53 crc kubenswrapper[4664]: I1003 08:12:53.061431 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" event={"ID":"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a","Type":"ContainerStarted","Data":"0db38d236346e984e5e55425dd22d51dffdeb902b36e0a79df00bb59cdff7ee2"} Oct 03 08:12:53 crc kubenswrapper[4664]: I1003 08:12:53.064093 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2d2ecc09-bcf3-4702-9456-e6c6880256cb","Type":"ContainerStarted","Data":"851280011c6d0e0238423a82334f6c07f06e835229170084d73a8695ccb3624d"} Oct 03 08:12:53 crc kubenswrapper[4664]: I1003 08:12:53.064118 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-cs9h5" Oct 03 08:12:53 crc kubenswrapper[4664]: I1003 08:12:53.283266 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-cs9h5"] Oct 03 08:12:53 crc kubenswrapper[4664]: I1003 08:12:53.291516 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-cs9h5"] Oct 03 08:12:53 crc kubenswrapper[4664]: I1003 08:12:53.888725 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e3132b2-1be9-42db-8653-927679a439f2" path="/var/lib/kubelet/pods/2e3132b2-1be9-42db-8653-927679a439f2/volumes" Oct 03 08:12:54 crc kubenswrapper[4664]: I1003 08:12:54.073818 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"309f57eb-9191-4466-a9eb-4beac6f647ae","Type":"ContainerStarted","Data":"1e5af8ca0318d158bbf1eeafa4b61bf6851f1b1861c232ac44c1e22802371741"} Oct 03 08:12:54 crc kubenswrapper[4664]: I1003 08:12:54.076340 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" event={"ID":"f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a","Type":"ContainerStarted","Data":"97ea216555cf832ad25fb071b0ea31f28aebe1b7ec322437c1f1c2fa473587fb"} Oct 03 08:12:54 crc kubenswrapper[4664]: I1003 08:12:54.076626 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:12:54 crc kubenswrapper[4664]: I1003 08:12:54.120200 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" podStartSLOduration=3.120179302 podStartE2EDuration="3.120179302s" podCreationTimestamp="2025-10-03 08:12:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:12:54.11712408 +0000 UTC m=+1474.938314590" watchObservedRunningTime="2025-10-03 08:12:54.120179302 +0000 UTC m=+1474.941369792" Oct 03 08:13:01 crc kubenswrapper[4664]: I1003 08:13:01.824718 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8c6f6df99-t8fh9" Oct 03 08:13:01 crc kubenswrapper[4664]: I1003 08:13:01.919951 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-czxf9"] Oct 03 08:13:01 crc kubenswrapper[4664]: I1003 08:13:01.920171 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9" podUID="90873c04-0d3c-41db-808b-8d550af4fe50" containerName="dnsmasq-dns" containerID="cri-o://e52f7c9a816336a03a3996b335ffe428595913b0431ab0ab580de426c6a8e1a3" gracePeriod=10 Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.153405 4664 generic.go:334] "Generic (PLEG): container finished" podID="90873c04-0d3c-41db-808b-8d550af4fe50" containerID="e52f7c9a816336a03a3996b335ffe428595913b0431ab0ab580de426c6a8e1a3" exitCode=0 Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.154523 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9" event={"ID":"90873c04-0d3c-41db-808b-8d550af4fe50","Type":"ContainerDied","Data":"e52f7c9a816336a03a3996b335ffe428595913b0431ab0ab580de426c6a8e1a3"} Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.417664 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9" Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.496829 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-config\") pod \"90873c04-0d3c-41db-808b-8d550af4fe50\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.496967 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-dns-swift-storage-0\") pod \"90873c04-0d3c-41db-808b-8d550af4fe50\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.497063 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp92w\" (UniqueName: \"kubernetes.io/projected/90873c04-0d3c-41db-808b-8d550af4fe50-kube-api-access-lp92w\") pod \"90873c04-0d3c-41db-808b-8d550af4fe50\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.497091 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-ovsdbserver-sb\") pod \"90873c04-0d3c-41db-808b-8d550af4fe50\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.497117 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-dns-svc\") pod \"90873c04-0d3c-41db-808b-8d550af4fe50\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.497180 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-ovsdbserver-nb\") pod \"90873c04-0d3c-41db-808b-8d550af4fe50\" (UID: \"90873c04-0d3c-41db-808b-8d550af4fe50\") " Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.505108 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90873c04-0d3c-41db-808b-8d550af4fe50-kube-api-access-lp92w" (OuterVolumeSpecName: "kube-api-access-lp92w") pod "90873c04-0d3c-41db-808b-8d550af4fe50" (UID: "90873c04-0d3c-41db-808b-8d550af4fe50"). InnerVolumeSpecName "kube-api-access-lp92w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.551962 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90873c04-0d3c-41db-808b-8d550af4fe50" (UID: "90873c04-0d3c-41db-808b-8d550af4fe50"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.552323 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "90873c04-0d3c-41db-808b-8d550af4fe50" (UID: "90873c04-0d3c-41db-808b-8d550af4fe50"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.553020 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90873c04-0d3c-41db-808b-8d550af4fe50" (UID: "90873c04-0d3c-41db-808b-8d550af4fe50"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.553700 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90873c04-0d3c-41db-808b-8d550af4fe50" (UID: "90873c04-0d3c-41db-808b-8d550af4fe50"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.553976 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-config" (OuterVolumeSpecName: "config") pod "90873c04-0d3c-41db-808b-8d550af4fe50" (UID: "90873c04-0d3c-41db-808b-8d550af4fe50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.599510 4664 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.599554 4664 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.599577 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp92w\" (UniqueName: \"kubernetes.io/projected/90873c04-0d3c-41db-808b-8d550af4fe50-kube-api-access-lp92w\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.599589 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.599618 4664 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:02 crc kubenswrapper[4664]: I1003 08:13:02.599628 4664 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90873c04-0d3c-41db-808b-8d550af4fe50-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:03 crc kubenswrapper[4664]: I1003 08:13:03.164235 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9" event={"ID":"90873c04-0d3c-41db-808b-8d550af4fe50","Type":"ContainerDied","Data":"0963a1e7e9dfccb8763fa46fa81764860893e9979f065b1d9534efe0a0392c64"} Oct 03 08:13:03 crc kubenswrapper[4664]: I1003 08:13:03.164323 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-czxf9" Oct 03 08:13:03 crc kubenswrapper[4664]: I1003 08:13:03.165266 4664 scope.go:117] "RemoveContainer" containerID="e52f7c9a816336a03a3996b335ffe428595913b0431ab0ab580de426c6a8e1a3" Oct 03 08:13:03 crc kubenswrapper[4664]: I1003 08:13:03.190228 4664 scope.go:117] "RemoveContainer" containerID="b620d494e00cebd9a19844540362b70ef19dd7906c828ea37f2362685b8dff76" Oct 03 08:13:03 crc kubenswrapper[4664]: I1003 08:13:03.201592 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-czxf9"] Oct 03 08:13:03 crc kubenswrapper[4664]: I1003 08:13:03.208899 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-czxf9"] Oct 03 08:13:03 crc kubenswrapper[4664]: I1003 08:13:03.885952 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90873c04-0d3c-41db-808b-8d550af4fe50" path="/var/lib/kubelet/pods/90873c04-0d3c-41db-808b-8d550af4fe50/volumes" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.361250 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8"] Oct 03 08:13:10 crc kubenswrapper[4664]: E1003 08:13:10.362362 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90873c04-0d3c-41db-808b-8d550af4fe50" containerName="init" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.362380 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="90873c04-0d3c-41db-808b-8d550af4fe50" containerName="init" Oct 03 08:13:10 crc kubenswrapper[4664]: E1003 08:13:10.362441 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90873c04-0d3c-41db-808b-8d550af4fe50" containerName="dnsmasq-dns" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.362450 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="90873c04-0d3c-41db-808b-8d550af4fe50" containerName="dnsmasq-dns" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.362715 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="90873c04-0d3c-41db-808b-8d550af4fe50" containerName="dnsmasq-dns" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.364054 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.369085 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.370301 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.371309 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.376991 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8"] Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.389036 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.457877 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grfh7\" (UniqueName: \"kubernetes.io/projected/45e227c3-f408-44ab-808f-8351845af92c-kube-api-access-grfh7\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.457949 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.458096 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.458127 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.560574 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.560717 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.560840 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grfh7\" (UniqueName: \"kubernetes.io/projected/45e227c3-f408-44ab-808f-8351845af92c-kube-api-access-grfh7\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.560910 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.569578 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.569680 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.571043 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.580041 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grfh7\" (UniqueName: \"kubernetes.io/projected/45e227c3-f408-44ab-808f-8351845af92c-kube-api-access-grfh7\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:10 crc kubenswrapper[4664]: I1003 08:13:10.683922 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:11 crc kubenswrapper[4664]: I1003 08:13:11.266934 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8"] Oct 03 08:13:11 crc kubenswrapper[4664]: W1003 08:13:11.278320 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45e227c3_f408_44ab_808f_8351845af92c.slice/crio-094a16b684e08a6a2988305c48349c9dedcc54d49ad94ed24cf5670547371845 WatchSource:0}: Error finding container 094a16b684e08a6a2988305c48349c9dedcc54d49ad94ed24cf5670547371845: Status 404 returned error can't find the container with id 094a16b684e08a6a2988305c48349c9dedcc54d49ad94ed24cf5670547371845 Oct 03 08:13:11 crc kubenswrapper[4664]: I1003 08:13:11.986678 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:13:11 crc kubenswrapper[4664]: I1003 08:13:11.986995 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:13:11 crc kubenswrapper[4664]: I1003 08:13:11.987061 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 08:13:11 crc kubenswrapper[4664]: I1003 08:13:11.988190 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2dfe6ff457d0c2bccf5db2631d7781386b8da1168146e54f8a4ae9ce420f6b83"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:13:11 crc kubenswrapper[4664]: I1003 08:13:11.988296 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://2dfe6ff457d0c2bccf5db2631d7781386b8da1168146e54f8a4ae9ce420f6b83" gracePeriod=600 Oct 03 08:13:12 crc kubenswrapper[4664]: I1003 08:13:12.260053 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" event={"ID":"45e227c3-f408-44ab-808f-8351845af92c","Type":"ContainerStarted","Data":"094a16b684e08a6a2988305c48349c9dedcc54d49ad94ed24cf5670547371845"} Oct 03 08:13:12 crc kubenswrapper[4664]: I1003 08:13:12.262298 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="2dfe6ff457d0c2bccf5db2631d7781386b8da1168146e54f8a4ae9ce420f6b83" exitCode=0 Oct 03 08:13:12 crc kubenswrapper[4664]: I1003 08:13:12.262325 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"2dfe6ff457d0c2bccf5db2631d7781386b8da1168146e54f8a4ae9ce420f6b83"} Oct 03 08:13:12 crc 
kubenswrapper[4664]: I1003 08:13:12.262377 4664 scope.go:117] "RemoveContainer" containerID="06473cda750028c12efef390356377e8ae805e2359da1c4b578e9e258218058e" Oct 03 08:13:13 crc kubenswrapper[4664]: I1003 08:13:13.278109 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9"} Oct 03 08:13:23 crc kubenswrapper[4664]: I1003 08:13:23.095955 4664 scope.go:117] "RemoveContainer" containerID="290801e369a3162ccccc045da875b27f969d39c2ea636a29e333fb34a4069e3b" Oct 03 08:13:23 crc kubenswrapper[4664]: I1003 08:13:23.560867 4664 scope.go:117] "RemoveContainer" containerID="e891ccea0eb9f5a873ecf1f89302da2311263d8cc2defb9f9f2c45fa4370137d" Oct 03 08:13:24 crc kubenswrapper[4664]: I1003 08:13:24.245438 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:13:24 crc kubenswrapper[4664]: I1003 08:13:24.392300 4664 generic.go:334] "Generic (PLEG): container finished" podID="2d2ecc09-bcf3-4702-9456-e6c6880256cb" containerID="851280011c6d0e0238423a82334f6c07f06e835229170084d73a8695ccb3624d" exitCode=0 Oct 03 08:13:24 crc kubenswrapper[4664]: I1003 08:13:24.392382 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2d2ecc09-bcf3-4702-9456-e6c6880256cb","Type":"ContainerDied","Data":"851280011c6d0e0238423a82334f6c07f06e835229170084d73a8695ccb3624d"} Oct 03 08:13:25 crc kubenswrapper[4664]: I1003 08:13:25.406533 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" event={"ID":"45e227c3-f408-44ab-808f-8351845af92c","Type":"ContainerStarted","Data":"7bde457705318bd92b7e585ac2936449f055d97e5bf461dcabc2c2aad9cc72db"} Oct 03 08:13:25 crc kubenswrapper[4664]: I1003 08:13:25.414589 4664 generic.go:334] "Generic (PLEG): container finished" podID="309f57eb-9191-4466-a9eb-4beac6f647ae" containerID="1e5af8ca0318d158bbf1eeafa4b61bf6851f1b1861c232ac44c1e22802371741" exitCode=0 Oct 03 08:13:25 crc kubenswrapper[4664]: I1003 08:13:25.414713 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"309f57eb-9191-4466-a9eb-4beac6f647ae","Type":"ContainerDied","Data":"1e5af8ca0318d158bbf1eeafa4b61bf6851f1b1861c232ac44c1e22802371741"} Oct 03 08:13:25 crc kubenswrapper[4664]: I1003 08:13:25.419936 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2d2ecc09-bcf3-4702-9456-e6c6880256cb","Type":"ContainerStarted","Data":"2d5b22538047a81aa734c3623b50a48938b8af48c278a281c3c48f5ce55d4570"} Oct 03 08:13:25 crc kubenswrapper[4664]: I1003 08:13:25.420194 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 08:13:25 crc kubenswrapper[4664]: I1003 08:13:25.466701 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.466676895 podStartE2EDuration="36.466676895s" podCreationTimestamp="2025-10-03 08:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:13:25.457642645 +0000 UTC m=+1506.278833145" watchObservedRunningTime="2025-10-03 08:13:25.466676895 +0000 UTC m=+1506.287867385" Oct 03 08:13:25 crc 
kubenswrapper[4664]: I1003 08:13:25.476809 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" podStartSLOduration=2.5142088769999997 podStartE2EDuration="15.476774106s" podCreationTimestamp="2025-10-03 08:13:10 +0000 UTC" firstStartedPulling="2025-10-03 08:13:11.2804469 +0000 UTC m=+1492.101637390" lastFinishedPulling="2025-10-03 08:13:24.243012129 +0000 UTC m=+1505.064202619" observedRunningTime="2025-10-03 08:13:25.429415672 +0000 UTC m=+1506.250606162" watchObservedRunningTime="2025-10-03 08:13:25.476774106 +0000 UTC m=+1506.297964596" Oct 03 08:13:26 crc kubenswrapper[4664]: I1003 08:13:26.437814 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"309f57eb-9191-4466-a9eb-4beac6f647ae","Type":"ContainerStarted","Data":"528f80c871267512ad92aa1d552157e75ecfb740a6d6578b3b14bf11f0893701"} Oct 03 08:13:26 crc kubenswrapper[4664]: I1003 08:13:26.438340 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:13:26 crc kubenswrapper[4664]: I1003 08:13:26.463007 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.462990324 podStartE2EDuration="36.462990324s" podCreationTimestamp="2025-10-03 08:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:13:26.459566946 +0000 UTC m=+1507.280757446" watchObservedRunningTime="2025-10-03 08:13:26.462990324 +0000 UTC m=+1507.284180804" Oct 03 08:13:36 crc kubenswrapper[4664]: I1003 08:13:36.542719 4664 generic.go:334] "Generic (PLEG): container finished" podID="45e227c3-f408-44ab-808f-8351845af92c" containerID="7bde457705318bd92b7e585ac2936449f055d97e5bf461dcabc2c2aad9cc72db" exitCode=0 Oct 03 08:13:36 crc kubenswrapper[4664]: I1003 08:13:36.542808 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" event={"ID":"45e227c3-f408-44ab-808f-8351845af92c","Type":"ContainerDied","Data":"7bde457705318bd92b7e585ac2936449f055d97e5bf461dcabc2c2aad9cc72db"} Oct 03 08:13:37 crc kubenswrapper[4664]: I1003 08:13:37.971237 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.102715 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-ssh-key\") pod \"45e227c3-f408-44ab-808f-8351845af92c\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.102866 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-inventory\") pod \"45e227c3-f408-44ab-808f-8351845af92c\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.103062 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-repo-setup-combined-ca-bundle\") pod \"45e227c3-f408-44ab-808f-8351845af92c\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.103329 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grfh7\" (UniqueName: \"kubernetes.io/projected/45e227c3-f408-44ab-808f-8351845af92c-kube-api-access-grfh7\") pod \"45e227c3-f408-44ab-808f-8351845af92c\" (UID: \"45e227c3-f408-44ab-808f-8351845af92c\") " Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.111015 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e227c3-f408-44ab-808f-8351845af92c-kube-api-access-grfh7" (OuterVolumeSpecName: "kube-api-access-grfh7") pod "45e227c3-f408-44ab-808f-8351845af92c" (UID: "45e227c3-f408-44ab-808f-8351845af92c"). InnerVolumeSpecName "kube-api-access-grfh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.111602 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "45e227c3-f408-44ab-808f-8351845af92c" (UID: "45e227c3-f408-44ab-808f-8351845af92c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.139965 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "45e227c3-f408-44ab-808f-8351845af92c" (UID: "45e227c3-f408-44ab-808f-8351845af92c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.141683 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-inventory" (OuterVolumeSpecName: "inventory") pod "45e227c3-f408-44ab-808f-8351845af92c" (UID: "45e227c3-f408-44ab-808f-8351845af92c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.206436 4664 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.206476 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grfh7\" (UniqueName: \"kubernetes.io/projected/45e227c3-f408-44ab-808f-8351845af92c-kube-api-access-grfh7\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.206488 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.206499 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45e227c3-f408-44ab-808f-8351845af92c-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.569668 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" event={"ID":"45e227c3-f408-44ab-808f-8351845af92c","Type":"ContainerDied","Data":"094a16b684e08a6a2988305c48349c9dedcc54d49ad94ed24cf5670547371845"} Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.569723 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="094a16b684e08a6a2988305c48349c9dedcc54d49ad94ed24cf5670547371845" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.569789 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.660559 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl"] Oct 03 08:13:38 crc kubenswrapper[4664]: E1003 08:13:38.661266 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e227c3-f408-44ab-808f-8351845af92c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.661294 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e227c3-f408-44ab-808f-8351845af92c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.661603 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e227c3-f408-44ab-808f-8351845af92c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.662457 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.666592 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.666674 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.667121 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.667120 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.675530 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl"] Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.718549 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2080288a-c108-46e5-b794-0a3f41eb2e31-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pdvl\" (UID: \"2080288a-c108-46e5-b794-0a3f41eb2e31\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.719082 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl8px\" (UniqueName: \"kubernetes.io/projected/2080288a-c108-46e5-b794-0a3f41eb2e31-kube-api-access-zl8px\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pdvl\" (UID: \"2080288a-c108-46e5-b794-0a3f41eb2e31\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.719149 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2080288a-c108-46e5-b794-0a3f41eb2e31-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pdvl\" (UID: \"2080288a-c108-46e5-b794-0a3f41eb2e31\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.820457 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2080288a-c108-46e5-b794-0a3f41eb2e31-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pdvl\" (UID: \"2080288a-c108-46e5-b794-0a3f41eb2e31\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.820503 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl8px\" (UniqueName: \"kubernetes.io/projected/2080288a-c108-46e5-b794-0a3f41eb2e31-kube-api-access-zl8px\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pdvl\" (UID: \"2080288a-c108-46e5-b794-0a3f41eb2e31\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.820547 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2080288a-c108-46e5-b794-0a3f41eb2e31-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pdvl\" (UID: \"2080288a-c108-46e5-b794-0a3f41eb2e31\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.824593 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2080288a-c108-46e5-b794-0a3f41eb2e31-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pdvl\" (UID: \"2080288a-c108-46e5-b794-0a3f41eb2e31\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.826899 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2080288a-c108-46e5-b794-0a3f41eb2e31-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pdvl\" (UID: \"2080288a-c108-46e5-b794-0a3f41eb2e31\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.838046 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl8px\" (UniqueName: \"kubernetes.io/projected/2080288a-c108-46e5-b794-0a3f41eb2e31-kube-api-access-zl8px\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pdvl\" (UID: \"2080288a-c108-46e5-b794-0a3f41eb2e31\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" Oct 03 08:13:38 crc kubenswrapper[4664]: I1003 08:13:38.990633 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" Oct 03 08:13:39 crc kubenswrapper[4664]: I1003 08:13:39.505857 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 08:13:39 crc kubenswrapper[4664]: I1003 08:13:39.528062 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl"] Oct 03 08:13:39 crc kubenswrapper[4664]: I1003 08:13:39.599839 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" event={"ID":"2080288a-c108-46e5-b794-0a3f41eb2e31","Type":"ContainerStarted","Data":"c4db789711ad81e5a5c37e17984790fcc9c2f8c9eec2c6ade93a57a6bf6a2e67"} Oct 03 08:13:40 crc kubenswrapper[4664]: I1003 08:13:40.610825 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" event={"ID":"2080288a-c108-46e5-b794-0a3f41eb2e31","Type":"ContainerStarted","Data":"b845919dc6e9290ef576f6d1bb7ff99754d6caee1415bde06a5330465f5f5711"} Oct 03 08:13:40 crc kubenswrapper[4664]: I1003 08:13:40.645222 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" podStartSLOduration=2.476954108 podStartE2EDuration="2.645186422s" podCreationTimestamp="2025-10-03 08:13:38 +0000 UTC" firstStartedPulling="2025-10-03 08:13:39.549016958 +0000 UTC m=+1520.370207448" lastFinishedPulling="2025-10-03 08:13:39.717249272 +0000 UTC m=+1520.538439762" observedRunningTime="2025-10-03 08:13:40.631809837 +0000 UTC m=+1521.453000327" watchObservedRunningTime="2025-10-03 08:13:40.645186422 +0000 UTC m=+1521.466376912" Oct 03 08:13:40 crc kubenswrapper[4664]: I1003 08:13:40.653881 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:13:43 crc kubenswrapper[4664]: I1003 08:13:43.646776 4664 generic.go:334] "Generic (PLEG): container finished" podID="2080288a-c108-46e5-b794-0a3f41eb2e31" 
containerID="b845919dc6e9290ef576f6d1bb7ff99754d6caee1415bde06a5330465f5f5711" exitCode=0 Oct 03 08:13:43 crc kubenswrapper[4664]: I1003 08:13:43.646857 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" event={"ID":"2080288a-c108-46e5-b794-0a3f41eb2e31","Type":"ContainerDied","Data":"b845919dc6e9290ef576f6d1bb7ff99754d6caee1415bde06a5330465f5f5711"} Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.155062 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.254489 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl8px\" (UniqueName: \"kubernetes.io/projected/2080288a-c108-46e5-b794-0a3f41eb2e31-kube-api-access-zl8px\") pod \"2080288a-c108-46e5-b794-0a3f41eb2e31\" (UID: \"2080288a-c108-46e5-b794-0a3f41eb2e31\") " Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.254572 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2080288a-c108-46e5-b794-0a3f41eb2e31-inventory\") pod \"2080288a-c108-46e5-b794-0a3f41eb2e31\" (UID: \"2080288a-c108-46e5-b794-0a3f41eb2e31\") " Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.254643 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2080288a-c108-46e5-b794-0a3f41eb2e31-ssh-key\") pod \"2080288a-c108-46e5-b794-0a3f41eb2e31\" (UID: \"2080288a-c108-46e5-b794-0a3f41eb2e31\") " Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.262073 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2080288a-c108-46e5-b794-0a3f41eb2e31-kube-api-access-zl8px" (OuterVolumeSpecName: "kube-api-access-zl8px") pod "2080288a-c108-46e5-b794-0a3f41eb2e31" (UID: "2080288a-c108-46e5-b794-0a3f41eb2e31"). InnerVolumeSpecName "kube-api-access-zl8px". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.290559 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2080288a-c108-46e5-b794-0a3f41eb2e31-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2080288a-c108-46e5-b794-0a3f41eb2e31" (UID: "2080288a-c108-46e5-b794-0a3f41eb2e31"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.291049 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2080288a-c108-46e5-b794-0a3f41eb2e31-inventory" (OuterVolumeSpecName: "inventory") pod "2080288a-c108-46e5-b794-0a3f41eb2e31" (UID: "2080288a-c108-46e5-b794-0a3f41eb2e31"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.357186 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl8px\" (UniqueName: \"kubernetes.io/projected/2080288a-c108-46e5-b794-0a3f41eb2e31-kube-api-access-zl8px\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.357233 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2080288a-c108-46e5-b794-0a3f41eb2e31-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.357247 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2080288a-c108-46e5-b794-0a3f41eb2e31-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.667036 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" event={"ID":"2080288a-c108-46e5-b794-0a3f41eb2e31","Type":"ContainerDied","Data":"c4db789711ad81e5a5c37e17984790fcc9c2f8c9eec2c6ade93a57a6bf6a2e67"} Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.667080 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4db789711ad81e5a5c37e17984790fcc9c2f8c9eec2c6ade93a57a6bf6a2e67" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.667105 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pdvl" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.737526 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn"] Oct 03 08:13:45 crc kubenswrapper[4664]: E1003 08:13:45.738046 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2080288a-c108-46e5-b794-0a3f41eb2e31" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.738068 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="2080288a-c108-46e5-b794-0a3f41eb2e31" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.738275 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="2080288a-c108-46e5-b794-0a3f41eb2e31" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.739054 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.742287 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.742940 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.742979 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.744135 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.750981 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn"] Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.868402 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkfns\" (UniqueName: \"kubernetes.io/projected/d922fb2a-651d-432e-9859-c89cd4a2268f-kube-api-access-xkfns\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn\" (UID: \"d922fb2a-651d-432e-9859-c89cd4a2268f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.868771 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn\" (UID: \"d922fb2a-651d-432e-9859-c89cd4a2268f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.868928 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn\" (UID: \"d922fb2a-651d-432e-9859-c89cd4a2268f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.869048 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn\" (UID: \"d922fb2a-651d-432e-9859-c89cd4a2268f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.971383 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn\" (UID: \"d922fb2a-651d-432e-9859-c89cd4a2268f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.971512 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn\" (UID: 
\"d922fb2a-651d-432e-9859-c89cd4a2268f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.971635 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkfns\" (UniqueName: \"kubernetes.io/projected/d922fb2a-651d-432e-9859-c89cd4a2268f-kube-api-access-xkfns\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn\" (UID: \"d922fb2a-651d-432e-9859-c89cd4a2268f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.971705 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn\" (UID: \"d922fb2a-651d-432e-9859-c89cd4a2268f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.976208 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn\" (UID: \"d922fb2a-651d-432e-9859-c89cd4a2268f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.978061 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn\" (UID: \"d922fb2a-651d-432e-9859-c89cd4a2268f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:13:45 crc kubenswrapper[4664]: I1003 08:13:45.986078 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn\" (UID: \"d922fb2a-651d-432e-9859-c89cd4a2268f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:13:46 crc kubenswrapper[4664]: I1003 08:13:46.011934 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkfns\" (UniqueName: \"kubernetes.io/projected/d922fb2a-651d-432e-9859-c89cd4a2268f-kube-api-access-xkfns\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn\" (UID: \"d922fb2a-651d-432e-9859-c89cd4a2268f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:13:46 crc kubenswrapper[4664]: I1003 08:13:46.070655 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:13:46 crc kubenswrapper[4664]: I1003 08:13:46.621208 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn"] Oct 03 08:13:46 crc kubenswrapper[4664]: I1003 08:13:46.677662 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" event={"ID":"d922fb2a-651d-432e-9859-c89cd4a2268f","Type":"ContainerStarted","Data":"2b1afd5f1c9d8ee3b5ce58facee34462d404e8b6b8f2f8b6928519a17e494e04"} Oct 03 08:13:47 crc kubenswrapper[4664]: I1003 08:13:47.691754 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" event={"ID":"d922fb2a-651d-432e-9859-c89cd4a2268f","Type":"ContainerStarted","Data":"799021bcc2c1c32162ef0968cc27a3f8fa2000ab9b68f7f2773115135c7e543c"} Oct 03 08:13:47 crc kubenswrapper[4664]: I1003 08:13:47.713037 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" podStartSLOduration=2.537973081 podStartE2EDuration="2.713009241s" podCreationTimestamp="2025-10-03 08:13:45 +0000 UTC" firstStartedPulling="2025-10-03 08:13:46.638618954 +0000 UTC m=+1527.459809434" lastFinishedPulling="2025-10-03 08:13:46.813655104 +0000 UTC m=+1527.634845594" observedRunningTime="2025-10-03 08:13:47.708558993 +0000 UTC m=+1528.529749503" watchObservedRunningTime="2025-10-03 08:13:47.713009241 +0000 UTC m=+1528.534199741" Oct 03 08:14:24 crc kubenswrapper[4664]: I1003 08:14:24.049120 4664 scope.go:117] "RemoveContainer" containerID="565fead3d3a1e0e35eed5e8760a29b61f356235e0ca65465e8ac8dee0f2e8ee0" Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.156975 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp"] Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.159802 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.163625 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.166725 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.169302 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp"] Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.317220 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56wxl\" (UniqueName: \"kubernetes.io/projected/7061a942-1c4a-417d-a7ba-9082b5a64147-kube-api-access-56wxl\") pod \"collect-profiles-29324655-n56qp\" (UID: \"7061a942-1c4a-417d-a7ba-9082b5a64147\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.317391 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7061a942-1c4a-417d-a7ba-9082b5a64147-secret-volume\") pod \"collect-profiles-29324655-n56qp\" (UID: \"7061a942-1c4a-417d-a7ba-9082b5a64147\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.317415 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7061a942-1c4a-417d-a7ba-9082b5a64147-config-volume\") pod \"collect-profiles-29324655-n56qp\" (UID: \"7061a942-1c4a-417d-a7ba-9082b5a64147\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.419644 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7061a942-1c4a-417d-a7ba-9082b5a64147-secret-volume\") pod \"collect-profiles-29324655-n56qp\" (UID: \"7061a942-1c4a-417d-a7ba-9082b5a64147\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.419704 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7061a942-1c4a-417d-a7ba-9082b5a64147-config-volume\") pod \"collect-profiles-29324655-n56qp\" (UID: \"7061a942-1c4a-417d-a7ba-9082b5a64147\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.419857 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56wxl\" (UniqueName: \"kubernetes.io/projected/7061a942-1c4a-417d-a7ba-9082b5a64147-kube-api-access-56wxl\") pod \"collect-profiles-29324655-n56qp\" (UID: \"7061a942-1c4a-417d-a7ba-9082b5a64147\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.421100 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7061a942-1c4a-417d-a7ba-9082b5a64147-config-volume\") pod 
\"collect-profiles-29324655-n56qp\" (UID: \"7061a942-1c4a-417d-a7ba-9082b5a64147\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.426739 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7061a942-1c4a-417d-a7ba-9082b5a64147-secret-volume\") pod \"collect-profiles-29324655-n56qp\" (UID: \"7061a942-1c4a-417d-a7ba-9082b5a64147\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.439421 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56wxl\" (UniqueName: \"kubernetes.io/projected/7061a942-1c4a-417d-a7ba-9082b5a64147-kube-api-access-56wxl\") pod \"collect-profiles-29324655-n56qp\" (UID: \"7061a942-1c4a-417d-a7ba-9082b5a64147\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.488793 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" Oct 03 08:15:00 crc kubenswrapper[4664]: I1003 08:15:00.940181 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp"] Oct 03 08:15:01 crc kubenswrapper[4664]: I1003 08:15:01.393960 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" event={"ID":"7061a942-1c4a-417d-a7ba-9082b5a64147","Type":"ContainerStarted","Data":"e21dc86ae7adf5be3298fc10c57f4f4752222547e3a2c5389335e8572afbbffb"} Oct 03 08:15:01 crc kubenswrapper[4664]: I1003 08:15:01.394346 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" event={"ID":"7061a942-1c4a-417d-a7ba-9082b5a64147","Type":"ContainerStarted","Data":"955e5f7d44056c37b1a0adb8b84abc2c459bba0d579df5a1bcce5698aeb0578b"} Oct 03 08:15:02 crc kubenswrapper[4664]: I1003 08:15:02.404406 4664 generic.go:334] "Generic (PLEG): container finished" podID="7061a942-1c4a-417d-a7ba-9082b5a64147" containerID="e21dc86ae7adf5be3298fc10c57f4f4752222547e3a2c5389335e8572afbbffb" exitCode=0 Oct 03 08:15:02 crc kubenswrapper[4664]: I1003 08:15:02.404625 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" event={"ID":"7061a942-1c4a-417d-a7ba-9082b5a64147","Type":"ContainerDied","Data":"e21dc86ae7adf5be3298fc10c57f4f4752222547e3a2c5389335e8572afbbffb"} Oct 03 08:15:03 crc kubenswrapper[4664]: I1003 08:15:03.742104 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" Oct 03 08:15:03 crc kubenswrapper[4664]: I1003 08:15:03.887800 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7061a942-1c4a-417d-a7ba-9082b5a64147-config-volume\") pod \"7061a942-1c4a-417d-a7ba-9082b5a64147\" (UID: \"7061a942-1c4a-417d-a7ba-9082b5a64147\") " Oct 03 08:15:03 crc kubenswrapper[4664]: I1003 08:15:03.888085 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7061a942-1c4a-417d-a7ba-9082b5a64147-secret-volume\") pod \"7061a942-1c4a-417d-a7ba-9082b5a64147\" (UID: \"7061a942-1c4a-417d-a7ba-9082b5a64147\") " Oct 03 08:15:03 crc kubenswrapper[4664]: I1003 08:15:03.888361 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56wxl\" (UniqueName: \"kubernetes.io/projected/7061a942-1c4a-417d-a7ba-9082b5a64147-kube-api-access-56wxl\") pod \"7061a942-1c4a-417d-a7ba-9082b5a64147\" (UID: \"7061a942-1c4a-417d-a7ba-9082b5a64147\") " Oct 03 08:15:03 crc kubenswrapper[4664]: I1003 08:15:03.888538 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7061a942-1c4a-417d-a7ba-9082b5a64147-config-volume" (OuterVolumeSpecName: "config-volume") pod "7061a942-1c4a-417d-a7ba-9082b5a64147" (UID: "7061a942-1c4a-417d-a7ba-9082b5a64147"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:15:03 crc kubenswrapper[4664]: I1003 08:15:03.888983 4664 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7061a942-1c4a-417d-a7ba-9082b5a64147-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:15:03 crc kubenswrapper[4664]: I1003 08:15:03.895217 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7061a942-1c4a-417d-a7ba-9082b5a64147-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7061a942-1c4a-417d-a7ba-9082b5a64147" (UID: "7061a942-1c4a-417d-a7ba-9082b5a64147"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:15:03 crc kubenswrapper[4664]: I1003 08:15:03.895333 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7061a942-1c4a-417d-a7ba-9082b5a64147-kube-api-access-56wxl" (OuterVolumeSpecName: "kube-api-access-56wxl") pod "7061a942-1c4a-417d-a7ba-9082b5a64147" (UID: "7061a942-1c4a-417d-a7ba-9082b5a64147"). InnerVolumeSpecName "kube-api-access-56wxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:15:03 crc kubenswrapper[4664]: I1003 08:15:03.991843 4664 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7061a942-1c4a-417d-a7ba-9082b5a64147-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:15:03 crc kubenswrapper[4664]: I1003 08:15:03.991887 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56wxl\" (UniqueName: \"kubernetes.io/projected/7061a942-1c4a-417d-a7ba-9082b5a64147-kube-api-access-56wxl\") on node \"crc\" DevicePath \"\"" Oct 03 08:15:04 crc kubenswrapper[4664]: I1003 08:15:04.429757 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" event={"ID":"7061a942-1c4a-417d-a7ba-9082b5a64147","Type":"ContainerDied","Data":"955e5f7d44056c37b1a0adb8b84abc2c459bba0d579df5a1bcce5698aeb0578b"} Oct 03 08:15:04 crc kubenswrapper[4664]: I1003 08:15:04.430197 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="955e5f7d44056c37b1a0adb8b84abc2c459bba0d579df5a1bcce5698aeb0578b" Oct 03 08:15:04 crc kubenswrapper[4664]: I1003 08:15:04.429822 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp" Oct 03 08:15:24 crc kubenswrapper[4664]: I1003 08:15:24.119544 4664 scope.go:117] "RemoveContainer" containerID="99b3c3c6e99c5f67161b1252b5c38eedb50cc2f6291ee26d308f4cebd24707be" Oct 03 08:15:24 crc kubenswrapper[4664]: I1003 08:15:24.144567 4664 scope.go:117] "RemoveContainer" containerID="287d1f9ca3e681244d3cce039edfcd409cea2b90f4debc427dfadeab6c59b4e2" Oct 03 08:15:24 crc kubenswrapper[4664]: I1003 08:15:24.217977 4664 scope.go:117] "RemoveContainer" containerID="dec11e910be9792de36f8c4bb865a990edfa318b57c9d2c1b37962bde0dc3069" Oct 03 08:15:24 crc kubenswrapper[4664]: I1003 08:15:24.239100 4664 scope.go:117] "RemoveContainer" containerID="2530ee8677ae62c71817ffe80d3cc18b30d54b1718a2b3ffc74a70b706a727d2" Oct 03 08:15:24 crc kubenswrapper[4664]: I1003 08:15:24.260942 4664 scope.go:117] "RemoveContainer" containerID="d521b5f2af42cfcb808e03eca31aa839fac7c4f87393779a5956121158d6926c" Oct 03 08:15:24 crc kubenswrapper[4664]: I1003 08:15:24.301764 4664 scope.go:117] "RemoveContainer" containerID="64d835e96ba4280eb7e483668dddb5bde20f1a1b8703908dd6b6d2c6bd5af680" Oct 03 08:15:41 crc kubenswrapper[4664]: I1003 08:15:41.987022 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:15:41 crc kubenswrapper[4664]: I1003 08:15:41.987590 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:16:11 crc kubenswrapper[4664]: I1003 08:16:11.987096 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 03 08:16:11 crc kubenswrapper[4664]: I1003 08:16:11.988101 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:16:41 crc kubenswrapper[4664]: I1003 08:16:41.987489 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:16:41 crc kubenswrapper[4664]: I1003 08:16:41.988032 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:16:41 crc kubenswrapper[4664]: I1003 08:16:41.988079 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 08:16:41 crc kubenswrapper[4664]: I1003 08:16:41.988906 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:16:41 crc kubenswrapper[4664]: I1003 08:16:41.988978 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" gracePeriod=600 Oct 03 08:16:42 crc kubenswrapper[4664]: E1003 08:16:42.115929 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:16:42 crc kubenswrapper[4664]: I1003 08:16:42.342443 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" exitCode=0 Oct 03 08:16:42 crc kubenswrapper[4664]: I1003 08:16:42.342493 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9"} Oct 03 08:16:42 crc kubenswrapper[4664]: I1003 08:16:42.342542 4664 scope.go:117] "RemoveContainer" containerID="2dfe6ff457d0c2bccf5db2631d7781386b8da1168146e54f8a4ae9ce420f6b83" Oct 03 08:16:42 crc kubenswrapper[4664]: I1003 08:16:42.343274 4664 scope.go:117] "RemoveContainer" 
containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:16:42 crc kubenswrapper[4664]: E1003 08:16:42.343541 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:16:52 crc kubenswrapper[4664]: I1003 08:16:52.876990 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:16:52 crc kubenswrapper[4664]: E1003 08:16:52.877680 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:16:53 crc kubenswrapper[4664]: I1003 08:16:53.450587 4664 generic.go:334] "Generic (PLEG): container finished" podID="d922fb2a-651d-432e-9859-c89cd4a2268f" containerID="799021bcc2c1c32162ef0968cc27a3f8fa2000ab9b68f7f2773115135c7e543c" exitCode=0 Oct 03 08:16:53 crc kubenswrapper[4664]: I1003 08:16:53.451047 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" event={"ID":"d922fb2a-651d-432e-9859-c89cd4a2268f","Type":"ContainerDied","Data":"799021bcc2c1c32162ef0968cc27a3f8fa2000ab9b68f7f2773115135c7e543c"} Oct 03 08:16:54 crc kubenswrapper[4664]: I1003 08:16:54.899575 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.055294 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-inventory\") pod \"d922fb2a-651d-432e-9859-c89cd4a2268f\" (UID: \"d922fb2a-651d-432e-9859-c89cd4a2268f\") " Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.055777 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-bootstrap-combined-ca-bundle\") pod \"d922fb2a-651d-432e-9859-c89cd4a2268f\" (UID: \"d922fb2a-651d-432e-9859-c89cd4a2268f\") " Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.055838 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkfns\" (UniqueName: \"kubernetes.io/projected/d922fb2a-651d-432e-9859-c89cd4a2268f-kube-api-access-xkfns\") pod \"d922fb2a-651d-432e-9859-c89cd4a2268f\" (UID: \"d922fb2a-651d-432e-9859-c89cd4a2268f\") " Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.056008 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-ssh-key\") pod \"d922fb2a-651d-432e-9859-c89cd4a2268f\" (UID: \"d922fb2a-651d-432e-9859-c89cd4a2268f\") " Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.062792 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d922fb2a-651d-432e-9859-c89cd4a2268f" (UID: "d922fb2a-651d-432e-9859-c89cd4a2268f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.063904 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d922fb2a-651d-432e-9859-c89cd4a2268f-kube-api-access-xkfns" (OuterVolumeSpecName: "kube-api-access-xkfns") pod "d922fb2a-651d-432e-9859-c89cd4a2268f" (UID: "d922fb2a-651d-432e-9859-c89cd4a2268f"). InnerVolumeSpecName "kube-api-access-xkfns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.083838 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d922fb2a-651d-432e-9859-c89cd4a2268f" (UID: "d922fb2a-651d-432e-9859-c89cd4a2268f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.084508 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-inventory" (OuterVolumeSpecName: "inventory") pod "d922fb2a-651d-432e-9859-c89cd4a2268f" (UID: "d922fb2a-651d-432e-9859-c89cd4a2268f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.199363 4664 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.199405 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkfns\" (UniqueName: \"kubernetes.io/projected/d922fb2a-651d-432e-9859-c89cd4a2268f-kube-api-access-xkfns\") on node \"crc\" DevicePath \"\"" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.199417 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.199427 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d922fb2a-651d-432e-9859-c89cd4a2268f-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.469585 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" event={"ID":"d922fb2a-651d-432e-9859-c89cd4a2268f","Type":"ContainerDied","Data":"2b1afd5f1c9d8ee3b5ce58facee34462d404e8b6b8f2f8b6928519a17e494e04"} Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.469647 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b1afd5f1c9d8ee3b5ce58facee34462d404e8b6b8f2f8b6928519a17e494e04" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.469653 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.553705 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl"] Oct 03 08:16:55 crc kubenswrapper[4664]: E1003 08:16:55.554128 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7061a942-1c4a-417d-a7ba-9082b5a64147" containerName="collect-profiles" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.554143 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="7061a942-1c4a-417d-a7ba-9082b5a64147" containerName="collect-profiles" Oct 03 08:16:55 crc kubenswrapper[4664]: E1003 08:16:55.554167 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d922fb2a-651d-432e-9859-c89cd4a2268f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.554175 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="d922fb2a-651d-432e-9859-c89cd4a2268f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.554366 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="d922fb2a-651d-432e-9859-c89cd4a2268f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.554390 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="7061a942-1c4a-417d-a7ba-9082b5a64147" containerName="collect-profiles" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.555059 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.557135 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.557313 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.560284 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.563872 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl"] Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.570298 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.607335 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70c69b69-6f48-4169-9cbc-a145ed1d8e07-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5tthl\" (UID: \"70c69b69-6f48-4169-9cbc-a145ed1d8e07\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.607394 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70c69b69-6f48-4169-9cbc-a145ed1d8e07-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5tthl\" (UID: \"70c69b69-6f48-4169-9cbc-a145ed1d8e07\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.607422 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfrxz\" (UniqueName: \"kubernetes.io/projected/70c69b69-6f48-4169-9cbc-a145ed1d8e07-kube-api-access-lfrxz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5tthl\" (UID: \"70c69b69-6f48-4169-9cbc-a145ed1d8e07\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.708426 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70c69b69-6f48-4169-9cbc-a145ed1d8e07-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5tthl\" (UID: \"70c69b69-6f48-4169-9cbc-a145ed1d8e07\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.708499 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfrxz\" (UniqueName: \"kubernetes.io/projected/70c69b69-6f48-4169-9cbc-a145ed1d8e07-kube-api-access-lfrxz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5tthl\" (UID: \"70c69b69-6f48-4169-9cbc-a145ed1d8e07\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.708715 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70c69b69-6f48-4169-9cbc-a145ed1d8e07-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-5tthl\" (UID: \"70c69b69-6f48-4169-9cbc-a145ed1d8e07\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.715462 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70c69b69-6f48-4169-9cbc-a145ed1d8e07-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5tthl\" (UID: \"70c69b69-6f48-4169-9cbc-a145ed1d8e07\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.715508 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70c69b69-6f48-4169-9cbc-a145ed1d8e07-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5tthl\" (UID: \"70c69b69-6f48-4169-9cbc-a145ed1d8e07\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.725684 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfrxz\" (UniqueName: \"kubernetes.io/projected/70c69b69-6f48-4169-9cbc-a145ed1d8e07-kube-api-access-lfrxz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5tthl\" (UID: \"70c69b69-6f48-4169-9cbc-a145ed1d8e07\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" Oct 03 08:16:55 crc kubenswrapper[4664]: I1003 08:16:55.881172 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" Oct 03 08:16:56 crc kubenswrapper[4664]: I1003 08:16:56.459747 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl"] Oct 03 08:16:56 crc kubenswrapper[4664]: I1003 08:16:56.466423 4664 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:16:56 crc kubenswrapper[4664]: I1003 08:16:56.478727 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" event={"ID":"70c69b69-6f48-4169-9cbc-a145ed1d8e07","Type":"ContainerStarted","Data":"76f190cca98fa058b17daface1ca2acd005e8259acb1b3cbbf5b7cf03fe96689"} Oct 03 08:16:57 crc kubenswrapper[4664]: I1003 08:16:57.488728 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" event={"ID":"70c69b69-6f48-4169-9cbc-a145ed1d8e07","Type":"ContainerStarted","Data":"f636d7d37e303f62bad6fd514bf47684f1be1c0fa13c22e6a2f487967f5d33f5"} Oct 03 08:16:57 crc kubenswrapper[4664]: I1003 08:16:57.513571 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" podStartSLOduration=2.350438339 podStartE2EDuration="2.513544545s" podCreationTimestamp="2025-10-03 08:16:55 +0000 UTC" firstStartedPulling="2025-10-03 08:16:56.466087223 +0000 UTC m=+1717.287277713" lastFinishedPulling="2025-10-03 08:16:56.629193429 +0000 UTC m=+1717.450383919" observedRunningTime="2025-10-03 08:16:57.504126793 +0000 UTC m=+1718.325317293" watchObservedRunningTime="2025-10-03 08:16:57.513544545 +0000 UTC m=+1718.334735035" Oct 03 08:17:06 crc kubenswrapper[4664]: I1003 08:17:06.876818 4664 scope.go:117] "RemoveContainer" 
containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:17:06 crc kubenswrapper[4664]: E1003 08:17:06.877549 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:17:18 crc kubenswrapper[4664]: I1003 08:17:18.876334 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:17:18 crc kubenswrapper[4664]: E1003 08:17:18.877328 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:17:31 crc kubenswrapper[4664]: I1003 08:17:31.877499 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:17:31 crc kubenswrapper[4664]: E1003 08:17:31.879478 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:17:33 crc kubenswrapper[4664]: I1003 08:17:33.047482 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-98kgj"] Oct 03 08:17:33 crc kubenswrapper[4664]: I1003 08:17:33.055818 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-98kgj"] Oct 03 08:17:33 crc kubenswrapper[4664]: I1003 08:17:33.894366 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a531dde-5ad1-4118-acef-d104aec77b92" path="/var/lib/kubelet/pods/6a531dde-5ad1-4118-acef-d104aec77b92/volumes" Oct 03 08:17:35 crc kubenswrapper[4664]: I1003 08:17:35.028720 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-m7kq6"] Oct 03 08:17:35 crc kubenswrapper[4664]: I1003 08:17:35.038487 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-drt6h"] Oct 03 08:17:35 crc kubenswrapper[4664]: I1003 08:17:35.046879 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-m7kq6"] Oct 03 08:17:35 crc kubenswrapper[4664]: I1003 08:17:35.054887 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-drt6h"] Oct 03 08:17:35 crc kubenswrapper[4664]: I1003 08:17:35.888378 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b5c50a5-3090-403e-9207-a48e761544c3" path="/var/lib/kubelet/pods/0b5c50a5-3090-403e-9207-a48e761544c3/volumes" Oct 03 08:17:35 crc kubenswrapper[4664]: I1003 08:17:35.889353 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25492a2d-416b-4e30-a717-2b814d47066e" 
path="/var/lib/kubelet/pods/25492a2d-416b-4e30-a717-2b814d47066e/volumes" Oct 03 08:17:44 crc kubenswrapper[4664]: I1003 08:17:44.031804 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3b68-account-create-tcs5r"] Oct 03 08:17:44 crc kubenswrapper[4664]: I1003 08:17:44.041850 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3b68-account-create-tcs5r"] Oct 03 08:17:44 crc kubenswrapper[4664]: I1003 08:17:44.876542 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:17:44 crc kubenswrapper[4664]: E1003 08:17:44.877103 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:17:45 crc kubenswrapper[4664]: I1003 08:17:45.048142 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-fbb7-account-create-ggzfs"] Oct 03 08:17:45 crc kubenswrapper[4664]: I1003 08:17:45.068661 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f9ab-account-create-62kjt"] Oct 03 08:17:45 crc kubenswrapper[4664]: I1003 08:17:45.080057 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-fbb7-account-create-ggzfs"] Oct 03 08:17:45 crc kubenswrapper[4664]: I1003 08:17:45.091207 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f9ab-account-create-62kjt"] Oct 03 08:17:45 crc kubenswrapper[4664]: I1003 08:17:45.887679 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19b6d284-9fc0-4929-97d1-1841f75ea25c" path="/var/lib/kubelet/pods/19b6d284-9fc0-4929-97d1-1841f75ea25c/volumes" Oct 03 08:17:45 crc kubenswrapper[4664]: I1003 08:17:45.888252 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b3ab103-d868-4ca1-98ef-90d62174e20d" path="/var/lib/kubelet/pods/6b3ab103-d868-4ca1-98ef-90d62174e20d/volumes" Oct 03 08:17:45 crc kubenswrapper[4664]: I1003 08:17:45.888816 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bba7e96-7b1b-4071-84d6-bf2f6705ca0b" path="/var/lib/kubelet/pods/6bba7e96-7b1b-4071-84d6-bf2f6705ca0b/volumes" Oct 03 08:17:55 crc kubenswrapper[4664]: I1003 08:17:55.876667 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:17:55 crc kubenswrapper[4664]: E1003 08:17:55.877422 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:18:10 crc kubenswrapper[4664]: I1003 08:18:10.878822 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:18:10 crc kubenswrapper[4664]: E1003 08:18:10.881276 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:18:12 crc kubenswrapper[4664]: I1003 08:18:12.035204 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-h4n98"] Oct 03 08:18:12 crc kubenswrapper[4664]: I1003 08:18:12.045830 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-wctgk"] Oct 03 08:18:12 crc kubenswrapper[4664]: I1003 08:18:12.055447 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-bn2c5"] Oct 03 08:18:12 crc kubenswrapper[4664]: I1003 08:18:12.063736 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-bn2c5"] Oct 03 08:18:12 crc kubenswrapper[4664]: I1003 08:18:12.071146 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-wctgk"] Oct 03 08:18:12 crc kubenswrapper[4664]: I1003 08:18:12.078936 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-h4n98"] Oct 03 08:18:13 crc kubenswrapper[4664]: I1003 08:18:13.889244 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13d51df0-aff8-4202-b94c-814faaf05cbb" path="/var/lib/kubelet/pods/13d51df0-aff8-4202-b94c-814faaf05cbb/volumes" Oct 03 08:18:13 crc kubenswrapper[4664]: I1003 08:18:13.890191 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a629e3c9-2dba-46d6-a978-bc6133d150e1" path="/var/lib/kubelet/pods/a629e3c9-2dba-46d6-a978-bc6133d150e1/volumes" Oct 03 08:18:13 crc kubenswrapper[4664]: I1003 08:18:13.890817 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a" path="/var/lib/kubelet/pods/ac091c1c-fab5-4ecd-87ff-c2a7a0367d5a/volumes" Oct 03 08:18:22 crc kubenswrapper[4664]: I1003 08:18:22.040909 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ab56-account-create-2c74m"] Oct 03 08:18:22 crc kubenswrapper[4664]: I1003 08:18:22.060559 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6b46-account-create-fgbmb"] Oct 03 08:18:22 crc kubenswrapper[4664]: I1003 08:18:22.075856 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-691a-account-create-v52t7"] Oct 03 08:18:22 crc kubenswrapper[4664]: I1003 08:18:22.086031 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ab56-account-create-2c74m"] Oct 03 08:18:22 crc kubenswrapper[4664]: I1003 08:18:22.096950 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6b46-account-create-fgbmb"] Oct 03 08:18:22 crc kubenswrapper[4664]: I1003 08:18:22.105341 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-691a-account-create-v52t7"] Oct 03 08:18:23 crc kubenswrapper[4664]: I1003 08:18:23.039368 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-8mfsn"] Oct 03 08:18:23 crc kubenswrapper[4664]: I1003 08:18:23.070254 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-8mfsn"] Oct 03 08:18:23 crc kubenswrapper[4664]: I1003 08:18:23.887340 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9197763c-fb03-4db8-9edb-b35fdc61856f" 
path="/var/lib/kubelet/pods/9197763c-fb03-4db8-9edb-b35fdc61856f/volumes" Oct 03 08:18:23 crc kubenswrapper[4664]: I1003 08:18:23.888077 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a" path="/var/lib/kubelet/pods/9c1c6c43-9fc4-46f4-b47e-620efb9d7c4a/volumes" Oct 03 08:18:23 crc kubenswrapper[4664]: I1003 08:18:23.888716 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dfe6751-9a34-4e24-9769-6fe97328ef4a" path="/var/lib/kubelet/pods/9dfe6751-9a34-4e24-9769-6fe97328ef4a/volumes" Oct 03 08:18:23 crc kubenswrapper[4664]: I1003 08:18:23.889333 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caaff67b-4184-4473-a892-305de99f3886" path="/var/lib/kubelet/pods/caaff67b-4184-4473-a892-305de99f3886/volumes" Oct 03 08:18:24 crc kubenswrapper[4664]: I1003 08:18:24.438561 4664 scope.go:117] "RemoveContainer" containerID="eb64e74ea9a2cba531637e440f05cb932c6b4375f5f1c63ec64f3e11fe1fff6d" Oct 03 08:18:24 crc kubenswrapper[4664]: I1003 08:18:24.489885 4664 scope.go:117] "RemoveContainer" containerID="f36f3035ecc75d0a667c02b19ccd5b69308b32c3f95f1d1b83f8b6b1576bebed" Oct 03 08:18:24 crc kubenswrapper[4664]: I1003 08:18:24.525654 4664 scope.go:117] "RemoveContainer" containerID="ddc484af05261aab8ea4328fe71be3ffd1e86e359c997c314493cf278dec7610" Oct 03 08:18:24 crc kubenswrapper[4664]: I1003 08:18:24.569580 4664 scope.go:117] "RemoveContainer" containerID="1c2ade37c5164cdd797ae746cdf91c1002d670a2d8140d6cfd5fb3fc1263bfbc" Oct 03 08:18:24 crc kubenswrapper[4664]: I1003 08:18:24.610424 4664 scope.go:117] "RemoveContainer" containerID="72fce17de0d48c9396b4eed1952bbe92dec8d0573dbad65dfb7a261bc47ce5c1" Oct 03 08:18:24 crc kubenswrapper[4664]: I1003 08:18:24.663426 4664 scope.go:117] "RemoveContainer" containerID="039bbc1e3505a3edb29319c43b0c1ae59c617ffc88403d3741a4819dbff7a581" Oct 03 08:18:24 crc kubenswrapper[4664]: I1003 08:18:24.717203 4664 scope.go:117] "RemoveContainer" containerID="1643f639e33cd2766005bc72f8b4849c583f9b645475ace313ca004b3e1cc402" Oct 03 08:18:24 crc kubenswrapper[4664]: I1003 08:18:24.736860 4664 scope.go:117] "RemoveContainer" containerID="fc414354f59d3a8c75fb2e543abed1b2e50124b4fecb4834ef68f4531fca7eea" Oct 03 08:18:24 crc kubenswrapper[4664]: I1003 08:18:24.756651 4664 scope.go:117] "RemoveContainer" containerID="2c4d2a2835d448567dbf445f8021b9906cd57cb07a25acf1f143e6bc32ecdc48" Oct 03 08:18:24 crc kubenswrapper[4664]: I1003 08:18:24.772980 4664 scope.go:117] "RemoveContainer" containerID="90ae26e109a6e5662ae4b4f690e8d4514c32b784b4472b876cffad013773a11f" Oct 03 08:18:24 crc kubenswrapper[4664]: I1003 08:18:24.791262 4664 scope.go:117] "RemoveContainer" containerID="30a26a31e0379301afd7539be361b0208d595d01bd2a18bb8ade062568ec1f4f" Oct 03 08:18:24 crc kubenswrapper[4664]: I1003 08:18:24.811057 4664 scope.go:117] "RemoveContainer" containerID="13398b4f9ee8746d3fd93b75af8e690a8ab7dee644b3b721d295c8fbca10c576" Oct 03 08:18:24 crc kubenswrapper[4664]: I1003 08:18:24.830472 4664 scope.go:117] "RemoveContainer" containerID="7299a9da53bdfc3fe22676a95215620d6317d1dac46480e92856710951713fdf" Oct 03 08:18:24 crc kubenswrapper[4664]: I1003 08:18:24.877630 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:18:24 crc kubenswrapper[4664]: E1003 08:18:24.877908 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:18:28 crc kubenswrapper[4664]: I1003 08:18:28.333269 4664 generic.go:334] "Generic (PLEG): container finished" podID="70c69b69-6f48-4169-9cbc-a145ed1d8e07" containerID="f636d7d37e303f62bad6fd514bf47684f1be1c0fa13c22e6a2f487967f5d33f5" exitCode=0 Oct 03 08:18:28 crc kubenswrapper[4664]: I1003 08:18:28.333363 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" event={"ID":"70c69b69-6f48-4169-9cbc-a145ed1d8e07","Type":"ContainerDied","Data":"f636d7d37e303f62bad6fd514bf47684f1be1c0fa13c22e6a2f487967f5d33f5"} Oct 03 08:18:29 crc kubenswrapper[4664]: I1003 08:18:29.740292 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" Oct 03 08:18:29 crc kubenswrapper[4664]: I1003 08:18:29.811288 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70c69b69-6f48-4169-9cbc-a145ed1d8e07-inventory\") pod \"70c69b69-6f48-4169-9cbc-a145ed1d8e07\" (UID: \"70c69b69-6f48-4169-9cbc-a145ed1d8e07\") " Oct 03 08:18:29 crc kubenswrapper[4664]: I1003 08:18:29.811492 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70c69b69-6f48-4169-9cbc-a145ed1d8e07-ssh-key\") pod \"70c69b69-6f48-4169-9cbc-a145ed1d8e07\" (UID: \"70c69b69-6f48-4169-9cbc-a145ed1d8e07\") " Oct 03 08:18:29 crc kubenswrapper[4664]: I1003 08:18:29.811637 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfrxz\" (UniqueName: \"kubernetes.io/projected/70c69b69-6f48-4169-9cbc-a145ed1d8e07-kube-api-access-lfrxz\") pod \"70c69b69-6f48-4169-9cbc-a145ed1d8e07\" (UID: \"70c69b69-6f48-4169-9cbc-a145ed1d8e07\") " Oct 03 08:18:29 crc kubenswrapper[4664]: I1003 08:18:29.820690 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c69b69-6f48-4169-9cbc-a145ed1d8e07-kube-api-access-lfrxz" (OuterVolumeSpecName: "kube-api-access-lfrxz") pod "70c69b69-6f48-4169-9cbc-a145ed1d8e07" (UID: "70c69b69-6f48-4169-9cbc-a145ed1d8e07"). InnerVolumeSpecName "kube-api-access-lfrxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:18:29 crc kubenswrapper[4664]: I1003 08:18:29.841967 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c69b69-6f48-4169-9cbc-a145ed1d8e07-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "70c69b69-6f48-4169-9cbc-a145ed1d8e07" (UID: "70c69b69-6f48-4169-9cbc-a145ed1d8e07"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:18:29 crc kubenswrapper[4664]: I1003 08:18:29.844491 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c69b69-6f48-4169-9cbc-a145ed1d8e07-inventory" (OuterVolumeSpecName: "inventory") pod "70c69b69-6f48-4169-9cbc-a145ed1d8e07" (UID: "70c69b69-6f48-4169-9cbc-a145ed1d8e07"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:18:29 crc kubenswrapper[4664]: I1003 08:18:29.914282 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70c69b69-6f48-4169-9cbc-a145ed1d8e07-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:18:29 crc kubenswrapper[4664]: I1003 08:18:29.914331 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfrxz\" (UniqueName: \"kubernetes.io/projected/70c69b69-6f48-4169-9cbc-a145ed1d8e07-kube-api-access-lfrxz\") on node \"crc\" DevicePath \"\"" Oct 03 08:18:29 crc kubenswrapper[4664]: I1003 08:18:29.914344 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70c69b69-6f48-4169-9cbc-a145ed1d8e07-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.357248 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" event={"ID":"70c69b69-6f48-4169-9cbc-a145ed1d8e07","Type":"ContainerDied","Data":"76f190cca98fa058b17daface1ca2acd005e8259acb1b3cbbf5b7cf03fe96689"} Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.357298 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76f190cca98fa058b17daface1ca2acd005e8259acb1b3cbbf5b7cf03fe96689" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.357335 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5tthl" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.458841 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm"] Oct 03 08:18:30 crc kubenswrapper[4664]: E1003 08:18:30.459591 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c69b69-6f48-4169-9cbc-a145ed1d8e07" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.459639 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c69b69-6f48-4169-9cbc-a145ed1d8e07" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.459987 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c69b69-6f48-4169-9cbc-a145ed1d8e07" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.460957 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.464480 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.464971 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.467599 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.467928 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.477879 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm"] Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.531148 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm\" (UID: \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.531204 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm\" (UID: \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.531331 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wqmw\" (UniqueName: \"kubernetes.io/projected/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-kube-api-access-9wqmw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm\" (UID: \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.633103 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wqmw\" (UniqueName: \"kubernetes.io/projected/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-kube-api-access-9wqmw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm\" (UID: \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.633216 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm\" (UID: \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.633249 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-inventory\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm\" (UID: \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.636892 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm\" (UID: \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.652722 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm\" (UID: \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.653534 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wqmw\" (UniqueName: \"kubernetes.io/projected/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-kube-api-access-9wqmw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm\" (UID: \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" Oct 03 08:18:30 crc kubenswrapper[4664]: I1003 08:18:30.790274 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" Oct 03 08:18:31 crc kubenswrapper[4664]: I1003 08:18:31.315540 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm"] Oct 03 08:18:31 crc kubenswrapper[4664]: I1003 08:18:31.376002 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" event={"ID":"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4","Type":"ContainerStarted","Data":"f2acf0b6ad77e5bbe1052d92cc93bea03409577e012406832e56094265bb4897"} Oct 03 08:18:32 crc kubenswrapper[4664]: I1003 08:18:32.386388 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" event={"ID":"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4","Type":"ContainerStarted","Data":"a31bb9f85c3263021ced5bfe7d7b26ecbb514a70f51507433439f9de49ca8f7e"} Oct 03 08:18:32 crc kubenswrapper[4664]: I1003 08:18:32.414757 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" podStartSLOduration=2.198666769 podStartE2EDuration="2.414728336s" podCreationTimestamp="2025-10-03 08:18:30 +0000 UTC" firstStartedPulling="2025-10-03 08:18:31.321519322 +0000 UTC m=+1812.142709822" lastFinishedPulling="2025-10-03 08:18:31.537580909 +0000 UTC m=+1812.358771389" observedRunningTime="2025-10-03 08:18:32.404813843 +0000 UTC m=+1813.226004353" watchObservedRunningTime="2025-10-03 08:18:32.414728336 +0000 UTC m=+1813.235918826" Oct 03 08:18:36 crc kubenswrapper[4664]: I1003 08:18:36.876569 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:18:36 crc kubenswrapper[4664]: E1003 08:18:36.877149 4664 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:18:50 crc kubenswrapper[4664]: I1003 08:18:50.876478 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:18:50 crc kubenswrapper[4664]: E1003 08:18:50.877805 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:19:02 crc kubenswrapper[4664]: I1003 08:19:02.876327 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:19:02 crc kubenswrapper[4664]: E1003 08:19:02.877316 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:19:03 crc kubenswrapper[4664]: I1003 08:19:03.043675 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-djcmk"] Oct 03 08:19:03 crc kubenswrapper[4664]: I1003 08:19:03.051960 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-djcmk"] Oct 03 08:19:03 crc kubenswrapper[4664]: I1003 08:19:03.903421 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d75d0f35-4cda-4925-8d0d-1666f794ce9b" path="/var/lib/kubelet/pods/d75d0f35-4cda-4925-8d0d-1666f794ce9b/volumes" Oct 03 08:19:16 crc kubenswrapper[4664]: I1003 08:19:16.052314 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-k59vh"] Oct 03 08:19:16 crc kubenswrapper[4664]: I1003 08:19:16.063784 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-q2khg"] Oct 03 08:19:16 crc kubenswrapper[4664]: I1003 08:19:16.080013 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-k59vh"] Oct 03 08:19:16 crc kubenswrapper[4664]: I1003 08:19:16.088979 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-q2khg"] Oct 03 08:19:16 crc kubenswrapper[4664]: I1003 08:19:16.879427 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:19:16 crc kubenswrapper[4664]: E1003 08:19:16.879945 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" 
podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:19:17 crc kubenswrapper[4664]: I1003 08:19:17.892811 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eef4071-9678-4d8e-a595-3ab2d97f1862" path="/var/lib/kubelet/pods/6eef4071-9678-4d8e-a595-3ab2d97f1862/volumes" Oct 03 08:19:17 crc kubenswrapper[4664]: I1003 08:19:17.894181 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c13de59d-0879-41ca-95cc-e8bf05c223eb" path="/var/lib/kubelet/pods/c13de59d-0879-41ca-95cc-e8bf05c223eb/volumes" Oct 03 08:19:20 crc kubenswrapper[4664]: I1003 08:19:20.039008 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vb5f6"] Oct 03 08:19:20 crc kubenswrapper[4664]: I1003 08:19:20.047811 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vb5f6"] Oct 03 08:19:21 crc kubenswrapper[4664]: I1003 08:19:21.890658 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc" path="/var/lib/kubelet/pods/bc164c8e-05b6-4b2e-bc8d-5eeb8970b4dc/volumes" Oct 03 08:19:22 crc kubenswrapper[4664]: I1003 08:19:22.025747 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wcvcw"] Oct 03 08:19:22 crc kubenswrapper[4664]: I1003 08:19:22.046597 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wcvcw"] Oct 03 08:19:23 crc kubenswrapper[4664]: I1003 08:19:23.893178 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82832c17-408b-4b89-992f-09e393024fe2" path="/var/lib/kubelet/pods/82832c17-408b-4b89-992f-09e393024fe2/volumes" Oct 03 08:19:25 crc kubenswrapper[4664]: I1003 08:19:25.059350 4664 scope.go:117] "RemoveContainer" containerID="be6533d7289d50bcfc9390a098140db08ef97032f4b07f20dbd48d768af4243a" Oct 03 08:19:25 crc kubenswrapper[4664]: I1003 08:19:25.102832 4664 scope.go:117] "RemoveContainer" containerID="6c5a48133e9038524a116cd42bf27c44fb9734cc10a9f3b5e3e6f8820e6c5446" Oct 03 08:19:25 crc kubenswrapper[4664]: I1003 08:19:25.141769 4664 scope.go:117] "RemoveContainer" containerID="c62a70250bacca48817b9ee175f91b0a12a26ee8d51bc5f0a3fe50be3677205e" Oct 03 08:19:25 crc kubenswrapper[4664]: I1003 08:19:25.195065 4664 scope.go:117] "RemoveContainer" containerID="b9c3c4d136277110b5b01a873e525e489aff77e0d0ef27946aaeaf217e35ca6c" Oct 03 08:19:25 crc kubenswrapper[4664]: I1003 08:19:25.256572 4664 scope.go:117] "RemoveContainer" containerID="2b6d39ba1d42eb78f1004d2232ad96134bd6c37a6bd1e60cd9b5ef41659c264c" Oct 03 08:19:27 crc kubenswrapper[4664]: I1003 08:19:27.876886 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:19:27 crc kubenswrapper[4664]: E1003 08:19:27.877137 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:19:41 crc kubenswrapper[4664]: I1003 08:19:41.876276 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:19:41 crc kubenswrapper[4664]: E1003 08:19:41.877150 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:19:43 crc kubenswrapper[4664]: I1003 08:19:43.040715 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-lgr4h"] Oct 03 08:19:43 crc kubenswrapper[4664]: I1003 08:19:43.050731 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-lgr4h"] Oct 03 08:19:43 crc kubenswrapper[4664]: I1003 08:19:43.888549 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d316d5e-f411-4940-af4d-9c42f5baae63" path="/var/lib/kubelet/pods/4d316d5e-f411-4940-af4d-9c42f5baae63/volumes" Oct 03 08:19:52 crc kubenswrapper[4664]: I1003 08:19:52.876902 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:19:52 crc kubenswrapper[4664]: E1003 08:19:52.877648 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:19:53 crc kubenswrapper[4664]: I1003 08:19:53.164495 4664 generic.go:334] "Generic (PLEG): container finished" podID="f1a9115e-e1a1-4de5-8a59-3b23ac395ec4" containerID="a31bb9f85c3263021ced5bfe7d7b26ecbb514a70f51507433439f9de49ca8f7e" exitCode=0 Oct 03 08:19:53 crc kubenswrapper[4664]: I1003 08:19:53.164545 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" event={"ID":"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4","Type":"ContainerDied","Data":"a31bb9f85c3263021ced5bfe7d7b26ecbb514a70f51507433439f9de49ca8f7e"} Oct 03 08:19:54 crc kubenswrapper[4664]: I1003 08:19:54.577055 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" Oct 03 08:19:54 crc kubenswrapper[4664]: I1003 08:19:54.643187 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-inventory\") pod \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\" (UID: \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\") " Oct 03 08:19:54 crc kubenswrapper[4664]: I1003 08:19:54.643253 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wqmw\" (UniqueName: \"kubernetes.io/projected/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-kube-api-access-9wqmw\") pod \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\" (UID: \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\") " Oct 03 08:19:54 crc kubenswrapper[4664]: I1003 08:19:54.643312 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-ssh-key\") pod \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\" (UID: \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\") " Oct 03 08:19:54 crc kubenswrapper[4664]: I1003 08:19:54.649371 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-kube-api-access-9wqmw" (OuterVolumeSpecName: "kube-api-access-9wqmw") pod "f1a9115e-e1a1-4de5-8a59-3b23ac395ec4" (UID: "f1a9115e-e1a1-4de5-8a59-3b23ac395ec4"). InnerVolumeSpecName "kube-api-access-9wqmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:19:54 crc kubenswrapper[4664]: E1003 08:19:54.673248 4664 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-ssh-key podName:f1a9115e-e1a1-4de5-8a59-3b23ac395ec4 nodeName:}" failed. No retries permitted until 2025-10-03 08:19:55.17321673 +0000 UTC m=+1895.994407220 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-ssh-key") pod "f1a9115e-e1a1-4de5-8a59-3b23ac395ec4" (UID: "f1a9115e-e1a1-4de5-8a59-3b23ac395ec4") : error deleting /var/lib/kubelet/pods/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4/volume-subpaths: remove /var/lib/kubelet/pods/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4/volume-subpaths: no such file or directory Oct 03 08:19:54 crc kubenswrapper[4664]: I1003 08:19:54.676074 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-inventory" (OuterVolumeSpecName: "inventory") pod "f1a9115e-e1a1-4de5-8a59-3b23ac395ec4" (UID: "f1a9115e-e1a1-4de5-8a59-3b23ac395ec4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:19:54 crc kubenswrapper[4664]: I1003 08:19:54.745842 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:19:54 crc kubenswrapper[4664]: I1003 08:19:54.745884 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wqmw\" (UniqueName: \"kubernetes.io/projected/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-kube-api-access-9wqmw\") on node \"crc\" DevicePath \"\"" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.184424 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" event={"ID":"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4","Type":"ContainerDied","Data":"f2acf0b6ad77e5bbe1052d92cc93bea03409577e012406832e56094265bb4897"} Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.184468 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2acf0b6ad77e5bbe1052d92cc93bea03409577e012406832e56094265bb4897" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.184493 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.255004 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-ssh-key\") pod \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\" (UID: \"f1a9115e-e1a1-4de5-8a59-3b23ac395ec4\") " Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.261598 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f1a9115e-e1a1-4de5-8a59-3b23ac395ec4" (UID: "f1a9115e-e1a1-4de5-8a59-3b23ac395ec4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.273371 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j"] Oct 03 08:19:55 crc kubenswrapper[4664]: E1003 08:19:55.273960 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a9115e-e1a1-4de5-8a59-3b23ac395ec4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.273990 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a9115e-e1a1-4de5-8a59-3b23ac395ec4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.274329 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a9115e-e1a1-4de5-8a59-3b23ac395ec4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.275534 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.293527 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j"] Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.357524 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8307f32a-5a7d-4239-bd00-5b69c80f407d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-57b7j\" (UID: \"8307f32a-5a7d-4239-bd00-5b69c80f407d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.357637 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8307f32a-5a7d-4239-bd00-5b69c80f407d-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-57b7j\" (UID: \"8307f32a-5a7d-4239-bd00-5b69c80f407d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.357764 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz5pm\" (UniqueName: \"kubernetes.io/projected/8307f32a-5a7d-4239-bd00-5b69c80f407d-kube-api-access-kz5pm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-57b7j\" (UID: \"8307f32a-5a7d-4239-bd00-5b69c80f407d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.357856 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1a9115e-e1a1-4de5-8a59-3b23ac395ec4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.460413 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz5pm\" (UniqueName: \"kubernetes.io/projected/8307f32a-5a7d-4239-bd00-5b69c80f407d-kube-api-access-kz5pm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-57b7j\" (UID: \"8307f32a-5a7d-4239-bd00-5b69c80f407d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.460593 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8307f32a-5a7d-4239-bd00-5b69c80f407d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-57b7j\" (UID: \"8307f32a-5a7d-4239-bd00-5b69c80f407d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.460720 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8307f32a-5a7d-4239-bd00-5b69c80f407d-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-57b7j\" (UID: \"8307f32a-5a7d-4239-bd00-5b69c80f407d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.466892 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8307f32a-5a7d-4239-bd00-5b69c80f407d-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-57b7j\" (UID: \"8307f32a-5a7d-4239-bd00-5b69c80f407d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.466967 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8307f32a-5a7d-4239-bd00-5b69c80f407d-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-57b7j\" (UID: \"8307f32a-5a7d-4239-bd00-5b69c80f407d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.480301 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz5pm\" (UniqueName: \"kubernetes.io/projected/8307f32a-5a7d-4239-bd00-5b69c80f407d-kube-api-access-kz5pm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-57b7j\" (UID: \"8307f32a-5a7d-4239-bd00-5b69c80f407d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" Oct 03 08:19:55 crc kubenswrapper[4664]: I1003 08:19:55.642917 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" Oct 03 08:19:56 crc kubenswrapper[4664]: I1003 08:19:56.192349 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j"] Oct 03 08:19:57 crc kubenswrapper[4664]: I1003 08:19:57.207852 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" event={"ID":"8307f32a-5a7d-4239-bd00-5b69c80f407d","Type":"ContainerStarted","Data":"c8b747722068c83ac0b9cb183c5d53ca4109725d166d4c9de1e21e8b6e7d21dd"} Oct 03 08:19:57 crc kubenswrapper[4664]: I1003 08:19:57.208269 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" event={"ID":"8307f32a-5a7d-4239-bd00-5b69c80f407d","Type":"ContainerStarted","Data":"35bed553f3509127e21c1e90fce296130625049f8c00744fdfe68f8eabf7784e"} Oct 03 08:19:57 crc kubenswrapper[4664]: I1003 08:19:57.234042 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" podStartSLOduration=2.085050427 podStartE2EDuration="2.23400816s" podCreationTimestamp="2025-10-03 08:19:55 +0000 UTC" firstStartedPulling="2025-10-03 08:19:56.203586182 +0000 UTC m=+1897.024776672" lastFinishedPulling="2025-10-03 08:19:56.352543915 +0000 UTC m=+1897.173734405" observedRunningTime="2025-10-03 08:19:57.224280921 +0000 UTC m=+1898.045471431" watchObservedRunningTime="2025-10-03 08:19:57.23400816 +0000 UTC m=+1898.055198650" Oct 03 08:20:02 crc kubenswrapper[4664]: I1003 08:20:02.262427 4664 generic.go:334] "Generic (PLEG): container finished" podID="8307f32a-5a7d-4239-bd00-5b69c80f407d" containerID="c8b747722068c83ac0b9cb183c5d53ca4109725d166d4c9de1e21e8b6e7d21dd" exitCode=0 Oct 03 08:20:02 crc kubenswrapper[4664]: I1003 08:20:02.262533 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" event={"ID":"8307f32a-5a7d-4239-bd00-5b69c80f407d","Type":"ContainerDied","Data":"c8b747722068c83ac0b9cb183c5d53ca4109725d166d4c9de1e21e8b6e7d21dd"} Oct 03 08:20:03 crc kubenswrapper[4664]: I1003 08:20:03.678896 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" Oct 03 08:20:03 crc kubenswrapper[4664]: I1003 08:20:03.745269 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8307f32a-5a7d-4239-bd00-5b69c80f407d-inventory\") pod \"8307f32a-5a7d-4239-bd00-5b69c80f407d\" (UID: \"8307f32a-5a7d-4239-bd00-5b69c80f407d\") " Oct 03 08:20:03 crc kubenswrapper[4664]: I1003 08:20:03.745341 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8307f32a-5a7d-4239-bd00-5b69c80f407d-ssh-key\") pod \"8307f32a-5a7d-4239-bd00-5b69c80f407d\" (UID: \"8307f32a-5a7d-4239-bd00-5b69c80f407d\") " Oct 03 08:20:03 crc kubenswrapper[4664]: I1003 08:20:03.745554 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz5pm\" (UniqueName: \"kubernetes.io/projected/8307f32a-5a7d-4239-bd00-5b69c80f407d-kube-api-access-kz5pm\") pod \"8307f32a-5a7d-4239-bd00-5b69c80f407d\" (UID: \"8307f32a-5a7d-4239-bd00-5b69c80f407d\") " Oct 03 08:20:03 crc kubenswrapper[4664]: I1003 08:20:03.751074 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8307f32a-5a7d-4239-bd00-5b69c80f407d-kube-api-access-kz5pm" (OuterVolumeSpecName: "kube-api-access-kz5pm") pod "8307f32a-5a7d-4239-bd00-5b69c80f407d" (UID: "8307f32a-5a7d-4239-bd00-5b69c80f407d"). InnerVolumeSpecName "kube-api-access-kz5pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:20:03 crc kubenswrapper[4664]: I1003 08:20:03.773327 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8307f32a-5a7d-4239-bd00-5b69c80f407d-inventory" (OuterVolumeSpecName: "inventory") pod "8307f32a-5a7d-4239-bd00-5b69c80f407d" (UID: "8307f32a-5a7d-4239-bd00-5b69c80f407d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:20:03 crc kubenswrapper[4664]: I1003 08:20:03.774464 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8307f32a-5a7d-4239-bd00-5b69c80f407d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8307f32a-5a7d-4239-bd00-5b69c80f407d" (UID: "8307f32a-5a7d-4239-bd00-5b69c80f407d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:20:03 crc kubenswrapper[4664]: I1003 08:20:03.847868 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8307f32a-5a7d-4239-bd00-5b69c80f407d-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:20:03 crc kubenswrapper[4664]: I1003 08:20:03.847904 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8307f32a-5a7d-4239-bd00-5b69c80f407d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:20:03 crc kubenswrapper[4664]: I1003 08:20:03.847916 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz5pm\" (UniqueName: \"kubernetes.io/projected/8307f32a-5a7d-4239-bd00-5b69c80f407d-kube-api-access-kz5pm\") on node \"crc\" DevicePath \"\"" Oct 03 08:20:03 crc kubenswrapper[4664]: I1003 08:20:03.877864 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:20:03 crc kubenswrapper[4664]: E1003 08:20:03.878341 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.281519 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" event={"ID":"8307f32a-5a7d-4239-bd00-5b69c80f407d","Type":"ContainerDied","Data":"35bed553f3509127e21c1e90fce296130625049f8c00744fdfe68f8eabf7784e"} Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.281570 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35bed553f3509127e21c1e90fce296130625049f8c00744fdfe68f8eabf7784e" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.281624 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-57b7j" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.354982 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44"] Oct 03 08:20:04 crc kubenswrapper[4664]: E1003 08:20:04.355749 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8307f32a-5a7d-4239-bd00-5b69c80f407d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.355843 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8307f32a-5a7d-4239-bd00-5b69c80f407d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.356106 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8307f32a-5a7d-4239-bd00-5b69c80f407d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.356889 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.359840 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.359998 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.360122 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.360179 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.365800 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44"] Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.459974 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq8rt\" (UniqueName: \"kubernetes.io/projected/d40d65c6-7d3e-4be9-8c0c-a24b74166668-kube-api-access-lq8rt\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kz44\" (UID: \"d40d65c6-7d3e-4be9-8c0c-a24b74166668\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.460431 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d40d65c6-7d3e-4be9-8c0c-a24b74166668-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kz44\" (UID: \"d40d65c6-7d3e-4be9-8c0c-a24b74166668\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.460487 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d40d65c6-7d3e-4be9-8c0c-a24b74166668-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kz44\" (UID: \"d40d65c6-7d3e-4be9-8c0c-a24b74166668\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.562441 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d40d65c6-7d3e-4be9-8c0c-a24b74166668-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kz44\" (UID: \"d40d65c6-7d3e-4be9-8c0c-a24b74166668\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.562537 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d40d65c6-7d3e-4be9-8c0c-a24b74166668-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kz44\" (UID: \"d40d65c6-7d3e-4be9-8c0c-a24b74166668\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.562647 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq8rt\" (UniqueName: \"kubernetes.io/projected/d40d65c6-7d3e-4be9-8c0c-a24b74166668-kube-api-access-lq8rt\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kz44\" (UID: 
\"d40d65c6-7d3e-4be9-8c0c-a24b74166668\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.569265 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d40d65c6-7d3e-4be9-8c0c-a24b74166668-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kz44\" (UID: \"d40d65c6-7d3e-4be9-8c0c-a24b74166668\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.569319 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d40d65c6-7d3e-4be9-8c0c-a24b74166668-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kz44\" (UID: \"d40d65c6-7d3e-4be9-8c0c-a24b74166668\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.582842 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq8rt\" (UniqueName: \"kubernetes.io/projected/d40d65c6-7d3e-4be9-8c0c-a24b74166668-kube-api-access-lq8rt\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kz44\" (UID: \"d40d65c6-7d3e-4be9-8c0c-a24b74166668\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" Oct 03 08:20:04 crc kubenswrapper[4664]: I1003 08:20:04.674581 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" Oct 03 08:20:05 crc kubenswrapper[4664]: I1003 08:20:05.205886 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44"] Oct 03 08:20:05 crc kubenswrapper[4664]: I1003 08:20:05.291735 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" event={"ID":"d40d65c6-7d3e-4be9-8c0c-a24b74166668","Type":"ContainerStarted","Data":"d8bba3c04a08b3d35da1638410eb1d81c4d5fca9937b2b236dec3ad5a009b98a"} Oct 03 08:20:06 crc kubenswrapper[4664]: I1003 08:20:06.300962 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" event={"ID":"d40d65c6-7d3e-4be9-8c0c-a24b74166668","Type":"ContainerStarted","Data":"ba81ed7edded3b7d322835869695863fb3ab24bfc0efc7939ee94c307a988eab"} Oct 03 08:20:06 crc kubenswrapper[4664]: I1003 08:20:06.323469 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" podStartSLOduration=2.137627756 podStartE2EDuration="2.323447887s" podCreationTimestamp="2025-10-03 08:20:04 +0000 UTC" firstStartedPulling="2025-10-03 08:20:05.210808449 +0000 UTC m=+1906.031998939" lastFinishedPulling="2025-10-03 08:20:05.39662859 +0000 UTC m=+1906.217819070" observedRunningTime="2025-10-03 08:20:06.315125539 +0000 UTC m=+1907.136316029" watchObservedRunningTime="2025-10-03 08:20:06.323447887 +0000 UTC m=+1907.144638377" Oct 03 08:20:10 crc kubenswrapper[4664]: I1003 08:20:10.040316 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-4q9vk"] Oct 03 08:20:10 crc kubenswrapper[4664]: I1003 08:20:10.054507 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-4q9vk"] Oct 03 08:20:11 crc kubenswrapper[4664]: I1003 08:20:11.037467 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-db-create-n7nck"] Oct 03 08:20:11 crc kubenswrapper[4664]: I1003 08:20:11.048701 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-zp2sz"] Oct 03 08:20:11 crc kubenswrapper[4664]: I1003 08:20:11.055953 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-n7nck"] Oct 03 08:20:11 crc kubenswrapper[4664]: I1003 08:20:11.062801 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-zp2sz"] Oct 03 08:20:11 crc kubenswrapper[4664]: I1003 08:20:11.893237 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1af25c5f-f748-4b42-8c46-f7929cb72bdf" path="/var/lib/kubelet/pods/1af25c5f-f748-4b42-8c46-f7929cb72bdf/volumes" Oct 03 08:20:11 crc kubenswrapper[4664]: I1003 08:20:11.893867 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76e6be14-99be-4c39-8b45-43cf4dd937c5" path="/var/lib/kubelet/pods/76e6be14-99be-4c39-8b45-43cf4dd937c5/volumes" Oct 03 08:20:11 crc kubenswrapper[4664]: I1003 08:20:11.894346 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baf2ab95-9e7c-4e4f-b653-c4dbb00b0745" path="/var/lib/kubelet/pods/baf2ab95-9e7c-4e4f-b653-c4dbb00b0745/volumes" Oct 03 08:20:16 crc kubenswrapper[4664]: I1003 08:20:16.876811 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:20:16 crc kubenswrapper[4664]: E1003 08:20:16.877997 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:20:17 crc kubenswrapper[4664]: I1003 08:20:17.045421 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-30c0-account-create-6g9b4"] Oct 03 08:20:17 crc kubenswrapper[4664]: I1003 08:20:17.054503 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-30c0-account-create-6g9b4"] Oct 03 08:20:17 crc kubenswrapper[4664]: I1003 08:20:17.888465 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="569e62d3-ff5a-483b-914c-c17b6b1c35da" path="/var/lib/kubelet/pods/569e62d3-ff5a-483b-914c-c17b6b1c35da/volumes" Oct 03 08:20:25 crc kubenswrapper[4664]: I1003 08:20:25.387674 4664 scope.go:117] "RemoveContainer" containerID="18a7ff075956ded64f9d6e2abb1947765764e84363b9a35a07db64b75756a64d" Oct 03 08:20:25 crc kubenswrapper[4664]: I1003 08:20:25.466888 4664 scope.go:117] "RemoveContainer" containerID="4ac161b7ccb2f318cb332bf0e1727ea22d29eb30c89b84332c224f983685b3fa" Oct 03 08:20:25 crc kubenswrapper[4664]: I1003 08:20:25.494372 4664 scope.go:117] "RemoveContainer" containerID="a166b452207f8454dd4f6d38f6173d2e37f64c6aa48d74c3e1e4cdcc87843d0c" Oct 03 08:20:25 crc kubenswrapper[4664]: I1003 08:20:25.543672 4664 scope.go:117] "RemoveContainer" containerID="1efed958dfcf921e8fc2ef0cf2a433973765b9179ebc906bf355b06b51e5e0d5" Oct 03 08:20:25 crc kubenswrapper[4664]: I1003 08:20:25.618900 4664 scope.go:117] "RemoveContainer" containerID="121c0f76bc2bdf81c4167dd3c3b09a06365ffdc73b78ef105096b7a9e8cf653a" Oct 03 08:20:28 crc kubenswrapper[4664]: I1003 08:20:28.036106 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-fa05-account-create-vslm4"] Oct 03 08:20:28 crc kubenswrapper[4664]: I1003 08:20:28.051434 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4ca0-account-create-8x7r2"] Oct 03 08:20:28 crc kubenswrapper[4664]: I1003 08:20:28.063542 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4ca0-account-create-8x7r2"] Oct 03 08:20:28 crc kubenswrapper[4664]: I1003 08:20:28.072425 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-fa05-account-create-vslm4"] Oct 03 08:20:28 crc kubenswrapper[4664]: I1003 08:20:28.876104 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:20:28 crc kubenswrapper[4664]: E1003 08:20:28.876993 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:20:29 crc kubenswrapper[4664]: I1003 08:20:29.891264 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17d3f5d3-82e5-4954-83a6-a7e0255df488" path="/var/lib/kubelet/pods/17d3f5d3-82e5-4954-83a6-a7e0255df488/volumes" Oct 03 08:20:29 crc kubenswrapper[4664]: I1003 08:20:29.892336 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53a3e54d-79e2-48c5-84df-499f3ac97222" path="/var/lib/kubelet/pods/53a3e54d-79e2-48c5-84df-499f3ac97222/volumes" Oct 03 08:20:39 crc kubenswrapper[4664]: I1003 08:20:39.882820 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:20:39 crc kubenswrapper[4664]: E1003 08:20:39.890356 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:20:42 crc kubenswrapper[4664]: I1003 08:20:42.639934 4664 generic.go:334] "Generic (PLEG): container finished" podID="d40d65c6-7d3e-4be9-8c0c-a24b74166668" containerID="ba81ed7edded3b7d322835869695863fb3ab24bfc0efc7939ee94c307a988eab" exitCode=0 Oct 03 08:20:42 crc kubenswrapper[4664]: I1003 08:20:42.640028 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" event={"ID":"d40d65c6-7d3e-4be9-8c0c-a24b74166668","Type":"ContainerDied","Data":"ba81ed7edded3b7d322835869695863fb3ab24bfc0efc7939ee94c307a988eab"} Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.059424 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.224063 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d40d65c6-7d3e-4be9-8c0c-a24b74166668-inventory\") pod \"d40d65c6-7d3e-4be9-8c0c-a24b74166668\" (UID: \"d40d65c6-7d3e-4be9-8c0c-a24b74166668\") " Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.224192 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d40d65c6-7d3e-4be9-8c0c-a24b74166668-ssh-key\") pod \"d40d65c6-7d3e-4be9-8c0c-a24b74166668\" (UID: \"d40d65c6-7d3e-4be9-8c0c-a24b74166668\") " Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.224468 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq8rt\" (UniqueName: \"kubernetes.io/projected/d40d65c6-7d3e-4be9-8c0c-a24b74166668-kube-api-access-lq8rt\") pod \"d40d65c6-7d3e-4be9-8c0c-a24b74166668\" (UID: \"d40d65c6-7d3e-4be9-8c0c-a24b74166668\") " Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.231105 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d40d65c6-7d3e-4be9-8c0c-a24b74166668-kube-api-access-lq8rt" (OuterVolumeSpecName: "kube-api-access-lq8rt") pod "d40d65c6-7d3e-4be9-8c0c-a24b74166668" (UID: "d40d65c6-7d3e-4be9-8c0c-a24b74166668"). InnerVolumeSpecName "kube-api-access-lq8rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.255386 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d40d65c6-7d3e-4be9-8c0c-a24b74166668-inventory" (OuterVolumeSpecName: "inventory") pod "d40d65c6-7d3e-4be9-8c0c-a24b74166668" (UID: "d40d65c6-7d3e-4be9-8c0c-a24b74166668"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.258757 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d40d65c6-7d3e-4be9-8c0c-a24b74166668-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d40d65c6-7d3e-4be9-8c0c-a24b74166668" (UID: "d40d65c6-7d3e-4be9-8c0c-a24b74166668"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.328684 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq8rt\" (UniqueName: \"kubernetes.io/projected/d40d65c6-7d3e-4be9-8c0c-a24b74166668-kube-api-access-lq8rt\") on node \"crc\" DevicePath \"\"" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.328733 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d40d65c6-7d3e-4be9-8c0c-a24b74166668-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.328748 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d40d65c6-7d3e-4be9-8c0c-a24b74166668-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.657461 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" event={"ID":"d40d65c6-7d3e-4be9-8c0c-a24b74166668","Type":"ContainerDied","Data":"d8bba3c04a08b3d35da1638410eb1d81c4d5fca9937b2b236dec3ad5a009b98a"} Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.657506 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8bba3c04a08b3d35da1638410eb1d81c4d5fca9937b2b236dec3ad5a009b98a" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.657538 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kz44" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.749774 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8"] Oct 03 08:20:44 crc kubenswrapper[4664]: E1003 08:20:44.750396 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40d65c6-7d3e-4be9-8c0c-a24b74166668" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.750428 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40d65c6-7d3e-4be9-8c0c-a24b74166668" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.750698 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="d40d65c6-7d3e-4be9-8c0c-a24b74166668" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.751492 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.754949 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.755805 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.756096 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.756112 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.761020 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8"] Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.943687 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bec1364-2710-466d-83d1-66e226d2e314-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gtls8\" (UID: \"1bec1364-2710-466d-83d1-66e226d2e314\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.944096 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bec1364-2710-466d-83d1-66e226d2e314-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gtls8\" (UID: \"1bec1364-2710-466d-83d1-66e226d2e314\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" Oct 03 08:20:44 crc kubenswrapper[4664]: I1003 08:20:44.944621 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp4lg\" (UniqueName: \"kubernetes.io/projected/1bec1364-2710-466d-83d1-66e226d2e314-kube-api-access-vp4lg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gtls8\" (UID: \"1bec1364-2710-466d-83d1-66e226d2e314\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" Oct 03 08:20:45 crc kubenswrapper[4664]: I1003 08:20:45.048025 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp4lg\" (UniqueName: \"kubernetes.io/projected/1bec1364-2710-466d-83d1-66e226d2e314-kube-api-access-vp4lg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gtls8\" (UID: \"1bec1364-2710-466d-83d1-66e226d2e314\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" Oct 03 08:20:45 crc kubenswrapper[4664]: I1003 08:20:45.048481 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bec1364-2710-466d-83d1-66e226d2e314-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gtls8\" (UID: \"1bec1364-2710-466d-83d1-66e226d2e314\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" Oct 03 08:20:45 crc kubenswrapper[4664]: I1003 08:20:45.048546 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bec1364-2710-466d-83d1-66e226d2e314-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gtls8\" 
(UID: \"1bec1364-2710-466d-83d1-66e226d2e314\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" Oct 03 08:20:45 crc kubenswrapper[4664]: I1003 08:20:45.054125 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bec1364-2710-466d-83d1-66e226d2e314-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gtls8\" (UID: \"1bec1364-2710-466d-83d1-66e226d2e314\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" Oct 03 08:20:45 crc kubenswrapper[4664]: I1003 08:20:45.055181 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bec1364-2710-466d-83d1-66e226d2e314-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gtls8\" (UID: \"1bec1364-2710-466d-83d1-66e226d2e314\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" Oct 03 08:20:45 crc kubenswrapper[4664]: I1003 08:20:45.069215 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp4lg\" (UniqueName: \"kubernetes.io/projected/1bec1364-2710-466d-83d1-66e226d2e314-kube-api-access-vp4lg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gtls8\" (UID: \"1bec1364-2710-466d-83d1-66e226d2e314\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" Oct 03 08:20:45 crc kubenswrapper[4664]: I1003 08:20:45.079981 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" Oct 03 08:20:45 crc kubenswrapper[4664]: I1003 08:20:45.642251 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8"] Oct 03 08:20:45 crc kubenswrapper[4664]: I1003 08:20:45.669376 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" event={"ID":"1bec1364-2710-466d-83d1-66e226d2e314","Type":"ContainerStarted","Data":"e72dd274caf644a614498a6a1e384f1d4f5df7358f81866bfbcdf7e27c048952"} Oct 03 08:20:46 crc kubenswrapper[4664]: I1003 08:20:46.683473 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" event={"ID":"1bec1364-2710-466d-83d1-66e226d2e314","Type":"ContainerStarted","Data":"f75b3ffc493773561eadf39d86593052eb790a4271823b572aa9096dced2b29b"} Oct 03 08:20:46 crc kubenswrapper[4664]: I1003 08:20:46.706403 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" podStartSLOduration=2.473059196 podStartE2EDuration="2.706375899s" podCreationTimestamp="2025-10-03 08:20:44 +0000 UTC" firstStartedPulling="2025-10-03 08:20:45.644095505 +0000 UTC m=+1946.465285995" lastFinishedPulling="2025-10-03 08:20:45.877412188 +0000 UTC m=+1946.698602698" observedRunningTime="2025-10-03 08:20:46.703224888 +0000 UTC m=+1947.524415388" watchObservedRunningTime="2025-10-03 08:20:46.706375899 +0000 UTC m=+1947.527566389" Oct 03 08:20:52 crc kubenswrapper[4664]: I1003 08:20:52.038898 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c6qmc"] Oct 03 08:20:52 crc kubenswrapper[4664]: I1003 08:20:52.067402 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c6qmc"] Oct 03 08:20:52 crc kubenswrapper[4664]: I1003 08:20:52.877409 4664 
Oct 03 08:20:52 crc kubenswrapper[4664]: I1003 08:20:52.877409 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9"
Oct 03 08:20:52 crc kubenswrapper[4664]: E1003 08:20:52.877889 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 08:20:53 crc kubenswrapper[4664]: I1003 08:20:53.888531 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4958ebda-1932-42bd-825b-c64ac09c50ac" path="/var/lib/kubelet/pods/4958ebda-1932-42bd-825b-c64ac09c50ac/volumes"
Oct 03 08:21:06 crc kubenswrapper[4664]: I1003 08:21:06.876404 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9"
Oct 03 08:21:06 crc kubenswrapper[4664]: E1003 08:21:06.877516 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 08:21:15 crc kubenswrapper[4664]: I1003 08:21:15.039646 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-srldj"]
Oct 03 08:21:15 crc kubenswrapper[4664]: I1003 08:21:15.048513 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-srldj"]
Oct 03 08:21:15 crc kubenswrapper[4664]: I1003 08:21:15.887803 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef" path="/var/lib/kubelet/pods/6deb0e14-600b-4c7c-93ec-3ba6e3b2c0ef/volumes"
Oct 03 08:21:16 crc kubenswrapper[4664]: I1003 08:21:16.030954 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7c992"]
Oct 03 08:21:16 crc kubenswrapper[4664]: I1003 08:21:16.038516 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7c992"]
Oct 03 08:21:17 crc kubenswrapper[4664]: I1003 08:21:17.893830 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb30837f-4a77-4972-a012-ef0c51b62cb5" path="/var/lib/kubelet/pods/bb30837f-4a77-4972-a012-ef0c51b62cb5/volumes"
Oct 03 08:21:21 crc kubenswrapper[4664]: I1003 08:21:21.877432 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9"
Oct 03 08:21:21 crc kubenswrapper[4664]: E1003 08:21:21.878582 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 08:21:25 crc kubenswrapper[4664]: I1003 08:21:25.758777 4664 scope.go:117] "RemoveContainer" containerID="e199afdf389fdeda61244d3d59d662461641ef71f62fd5a6d8cb8c80371c855e"
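The recurring E1003 "Error syncing pod, skipping ... CrashLoopBackOff: back-off 5m0s" lines show machine-config-daemon sitting out a restart back-off: each sync attempt is rejected until the delay expires (the container finally restarts at 08:21:46 below). Kubelet's restart back-off is commonly described as starting around 10s and doubling per crash up to a 5m cap, which matches the "back-off 5m0s" seen here; a sketch of that schedule, as an assumption about the policy rather than kubelet source:

    # Assumed policy: exponential back-off, 10s base, doubling, capped at 5m.
    def crashloop_backoffs(restarts: int, base: float = 10.0, cap: float = 300.0):
        delay = base
        for _ in range(restarts):
            yield min(delay, cap)
            delay *= 2

    print(list(crashloop_backoffs(7)))  # [10.0, 20.0, 40.0, 80.0, 160.0, 300.0, 300.0]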
Oct 03 08:21:25 crc kubenswrapper[4664]: I1003 08:21:25.815957 4664 scope.go:117] "RemoveContainer" containerID="b0b1c809d633c9aa80a2ccb69d3579a6dc65f481cf89091f33d43f760ee397bc"
Oct 03 08:21:25 crc kubenswrapper[4664]: I1003 08:21:25.875986 4664 scope.go:117] "RemoveContainer" containerID="6559c0f3b65e30b8ddff55a52fd232a7626191dae7979b6ec4aabe70abbb11e6"
Oct 03 08:21:25 crc kubenswrapper[4664]: I1003 08:21:25.922902 4664 scope.go:117] "RemoveContainer" containerID="457c614fa57ae92ea61f3660bf6ce50cb90ad8ec656c3e162df330f5da85a486"
Oct 03 08:21:25 crc kubenswrapper[4664]: I1003 08:21:25.947490 4664 scope.go:117] "RemoveContainer" containerID="6c1a4104f1eb041df233e6d4b76bb980a6742eb3acd7a3585ff5f731ec33c810"
Oct 03 08:21:33 crc kubenswrapper[4664]: I1003 08:21:33.877469 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9"
Oct 03 08:21:33 crc kubenswrapper[4664]: E1003 08:21:33.878352 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 08:21:39 crc kubenswrapper[4664]: I1003 08:21:39.155367 4664 generic.go:334] "Generic (PLEG): container finished" podID="1bec1364-2710-466d-83d1-66e226d2e314" containerID="f75b3ffc493773561eadf39d86593052eb790a4271823b572aa9096dced2b29b" exitCode=2
Oct 03 08:21:39 crc kubenswrapper[4664]: I1003 08:21:39.155417 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" event={"ID":"1bec1364-2710-466d-83d1-66e226d2e314","Type":"ContainerDied","Data":"f75b3ffc493773561eadf39d86593052eb790a4271823b572aa9096dced2b29b"}
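The "container finished ... exitCode=2" PLEG record is where the gtls8 job's failure actually surfaces at this level; the volume teardown that follows, and the replacement pod 77v4r created at 08:21:48, both flow from this one event. A sketch for scanning a journal dump for non-zero exits; this is illustrative tooling, not part of kubelet:

    import re, sys

    # Match the PLEG "container finished" record and capture pod, ID, exit code.
    FINISHED = re.compile(
        r'container finished" podID="(?P<pod>[^"]+)" '
        r'containerID="(?P<cid>[0-9a-f]+)" exitCode=(?P<code>-?\d+)'
    )

    # Usage: journalctl -u kubelet | python3 scan_exits.py
    for line in sys.stdin:
        m = FINISHED.search(line)
        if m and int(m.group("code")) != 0:
            print(f'{m.group("pod")} {m.group("cid")[:12]} exited {m.group("code")}')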
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" Oct 03 08:21:40 crc kubenswrapper[4664]: I1003 08:21:40.652046 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bec1364-2710-466d-83d1-66e226d2e314-ssh-key\") pod \"1bec1364-2710-466d-83d1-66e226d2e314\" (UID: \"1bec1364-2710-466d-83d1-66e226d2e314\") " Oct 03 08:21:40 crc kubenswrapper[4664]: I1003 08:21:40.652185 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bec1364-2710-466d-83d1-66e226d2e314-inventory\") pod \"1bec1364-2710-466d-83d1-66e226d2e314\" (UID: \"1bec1364-2710-466d-83d1-66e226d2e314\") " Oct 03 08:21:40 crc kubenswrapper[4664]: I1003 08:21:40.652502 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp4lg\" (UniqueName: \"kubernetes.io/projected/1bec1364-2710-466d-83d1-66e226d2e314-kube-api-access-vp4lg\") pod \"1bec1364-2710-466d-83d1-66e226d2e314\" (UID: \"1bec1364-2710-466d-83d1-66e226d2e314\") " Oct 03 08:21:40 crc kubenswrapper[4664]: I1003 08:21:40.665942 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bec1364-2710-466d-83d1-66e226d2e314-kube-api-access-vp4lg" (OuterVolumeSpecName: "kube-api-access-vp4lg") pod "1bec1364-2710-466d-83d1-66e226d2e314" (UID: "1bec1364-2710-466d-83d1-66e226d2e314"). InnerVolumeSpecName "kube-api-access-vp4lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:21:40 crc kubenswrapper[4664]: I1003 08:21:40.685513 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bec1364-2710-466d-83d1-66e226d2e314-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1bec1364-2710-466d-83d1-66e226d2e314" (UID: "1bec1364-2710-466d-83d1-66e226d2e314"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:21:40 crc kubenswrapper[4664]: I1003 08:21:40.694594 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bec1364-2710-466d-83d1-66e226d2e314-inventory" (OuterVolumeSpecName: "inventory") pod "1bec1364-2710-466d-83d1-66e226d2e314" (UID: "1bec1364-2710-466d-83d1-66e226d2e314"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:21:40 crc kubenswrapper[4664]: I1003 08:21:40.754995 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp4lg\" (UniqueName: \"kubernetes.io/projected/1bec1364-2710-466d-83d1-66e226d2e314-kube-api-access-vp4lg\") on node \"crc\" DevicePath \"\"" Oct 03 08:21:40 crc kubenswrapper[4664]: I1003 08:21:40.755032 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bec1364-2710-466d-83d1-66e226d2e314-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:21:40 crc kubenswrapper[4664]: I1003 08:21:40.755044 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bec1364-2710-466d-83d1-66e226d2e314-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:21:41 crc kubenswrapper[4664]: I1003 08:21:41.177497 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" event={"ID":"1bec1364-2710-466d-83d1-66e226d2e314","Type":"ContainerDied","Data":"e72dd274caf644a614498a6a1e384f1d4f5df7358f81866bfbcdf7e27c048952"} Oct 03 08:21:41 crc kubenswrapper[4664]: I1003 08:21:41.177901 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e72dd274caf644a614498a6a1e384f1d4f5df7358f81866bfbcdf7e27c048952" Oct 03 08:21:41 crc kubenswrapper[4664]: I1003 08:21:41.177588 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gtls8" Oct 03 08:21:45 crc kubenswrapper[4664]: I1003 08:21:45.876943 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:21:46 crc kubenswrapper[4664]: I1003 08:21:46.243150 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"be55d517673418e2a00ffa031e5df1b78c2e2063781c6c50b4ccffd65918f5b0"} Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.036926 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r"] Oct 03 08:21:48 crc kubenswrapper[4664]: E1003 08:21:48.038220 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bec1364-2710-466d-83d1-66e226d2e314" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.038239 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bec1364-2710-466d-83d1-66e226d2e314" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.038503 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bec1364-2710-466d-83d1-66e226d2e314" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.039452 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.043592 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.043853 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.044108 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.044293 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.080671 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r"] Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.117711 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa362fe3-175b-4212-b34c-341eab1572cf-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-77v4r\" (UID: \"fa362fe3-175b-4212-b34c-341eab1572cf\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.117818 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa362fe3-175b-4212-b34c-341eab1572cf-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-77v4r\" (UID: \"fa362fe3-175b-4212-b34c-341eab1572cf\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.117905 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkw5w\" (UniqueName: \"kubernetes.io/projected/fa362fe3-175b-4212-b34c-341eab1572cf-kube-api-access-nkw5w\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-77v4r\" (UID: \"fa362fe3-175b-4212-b34c-341eab1572cf\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.219676 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkw5w\" (UniqueName: \"kubernetes.io/projected/fa362fe3-175b-4212-b34c-341eab1572cf-kube-api-access-nkw5w\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-77v4r\" (UID: \"fa362fe3-175b-4212-b34c-341eab1572cf\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.219877 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa362fe3-175b-4212-b34c-341eab1572cf-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-77v4r\" (UID: \"fa362fe3-175b-4212-b34c-341eab1572cf\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.219944 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa362fe3-175b-4212-b34c-341eab1572cf-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-77v4r\" 
(UID: \"fa362fe3-175b-4212-b34c-341eab1572cf\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.230992 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa362fe3-175b-4212-b34c-341eab1572cf-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-77v4r\" (UID: \"fa362fe3-175b-4212-b34c-341eab1572cf\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.239949 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkw5w\" (UniqueName: \"kubernetes.io/projected/fa362fe3-175b-4212-b34c-341eab1572cf-kube-api-access-nkw5w\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-77v4r\" (UID: \"fa362fe3-175b-4212-b34c-341eab1572cf\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.240396 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa362fe3-175b-4212-b34c-341eab1572cf-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-77v4r\" (UID: \"fa362fe3-175b-4212-b34c-341eab1572cf\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.365763 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" Oct 03 08:21:48 crc kubenswrapper[4664]: I1003 08:21:48.882499 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r"] Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.280473 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" event={"ID":"fa362fe3-175b-4212-b34c-341eab1572cf","Type":"ContainerStarted","Data":"69948ce356e51ab67998072d5f8c92156426ba1c3fe7b4f57c1f576c323bfb17"} Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.281051 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" event={"ID":"fa362fe3-175b-4212-b34c-341eab1572cf","Type":"ContainerStarted","Data":"89c5ed45760366f6e231e7840c768dd2318282935c439501b2184e8ecedfebcf"} Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.288989 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2nlrn"] Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.292755 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2nlrn" Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.304095 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2nlrn"] Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.313435 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" podStartSLOduration=1.157500603 podStartE2EDuration="1.31341189s" podCreationTimestamp="2025-10-03 08:21:48 +0000 UTC" firstStartedPulling="2025-10-03 08:21:48.893543167 +0000 UTC m=+2009.714733657" lastFinishedPulling="2025-10-03 08:21:49.049454454 +0000 UTC m=+2009.870644944" observedRunningTime="2025-10-03 08:21:49.301036649 +0000 UTC m=+2010.122227139" watchObservedRunningTime="2025-10-03 08:21:49.31341189 +0000 UTC m=+2010.134602380" Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.346004 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2q5m\" (UniqueName: \"kubernetes.io/projected/fa1b990a-87d4-4c51-b834-474fd1b6373c-kube-api-access-z2q5m\") pod \"community-operators-2nlrn\" (UID: \"fa1b990a-87d4-4c51-b834-474fd1b6373c\") " pod="openshift-marketplace/community-operators-2nlrn" Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.346149 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa1b990a-87d4-4c51-b834-474fd1b6373c-utilities\") pod \"community-operators-2nlrn\" (UID: \"fa1b990a-87d4-4c51-b834-474fd1b6373c\") " pod="openshift-marketplace/community-operators-2nlrn" Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.346267 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa1b990a-87d4-4c51-b834-474fd1b6373c-catalog-content\") pod \"community-operators-2nlrn\" (UID: \"fa1b990a-87d4-4c51-b834-474fd1b6373c\") " pod="openshift-marketplace/community-operators-2nlrn" Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.448436 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2q5m\" (UniqueName: \"kubernetes.io/projected/fa1b990a-87d4-4c51-b834-474fd1b6373c-kube-api-access-z2q5m\") pod \"community-operators-2nlrn\" (UID: \"fa1b990a-87d4-4c51-b834-474fd1b6373c\") " pod="openshift-marketplace/community-operators-2nlrn" Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.448546 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa1b990a-87d4-4c51-b834-474fd1b6373c-utilities\") pod \"community-operators-2nlrn\" (UID: \"fa1b990a-87d4-4c51-b834-474fd1b6373c\") " pod="openshift-marketplace/community-operators-2nlrn" Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.448578 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa1b990a-87d4-4c51-b834-474fd1b6373c-catalog-content\") pod \"community-operators-2nlrn\" (UID: \"fa1b990a-87d4-4c51-b834-474fd1b6373c\") " pod="openshift-marketplace/community-operators-2nlrn" Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.449138 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fa1b990a-87d4-4c51-b834-474fd1b6373c-catalog-content\") pod \"community-operators-2nlrn\" (UID: \"fa1b990a-87d4-4c51-b834-474fd1b6373c\") " pod="openshift-marketplace/community-operators-2nlrn" Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.449284 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa1b990a-87d4-4c51-b834-474fd1b6373c-utilities\") pod \"community-operators-2nlrn\" (UID: \"fa1b990a-87d4-4c51-b834-474fd1b6373c\") " pod="openshift-marketplace/community-operators-2nlrn" Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.466591 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2q5m\" (UniqueName: \"kubernetes.io/projected/fa1b990a-87d4-4c51-b834-474fd1b6373c-kube-api-access-z2q5m\") pod \"community-operators-2nlrn\" (UID: \"fa1b990a-87d4-4c51-b834-474fd1b6373c\") " pod="openshift-marketplace/community-operators-2nlrn" Oct 03 08:21:49 crc kubenswrapper[4664]: I1003 08:21:49.631476 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2nlrn" Oct 03 08:21:50 crc kubenswrapper[4664]: I1003 08:21:50.188580 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2nlrn"] Oct 03 08:21:50 crc kubenswrapper[4664]: I1003 08:21:50.295546 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nlrn" event={"ID":"fa1b990a-87d4-4c51-b834-474fd1b6373c","Type":"ContainerStarted","Data":"e1969815be0a76bb57242d8895455dd56d493f3458af1f6c294c28ebe10a8ef0"} Oct 03 08:21:51 crc kubenswrapper[4664]: I1003 08:21:51.308323 4664 generic.go:334] "Generic (PLEG): container finished" podID="fa1b990a-87d4-4c51-b834-474fd1b6373c" containerID="c0d2fbe01dd501ae66b4870802127c4352a5f2a1593d6c332c84f54bc1596d5b" exitCode=0 Oct 03 08:21:51 crc kubenswrapper[4664]: I1003 08:21:51.308377 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nlrn" event={"ID":"fa1b990a-87d4-4c51-b834-474fd1b6373c","Type":"ContainerDied","Data":"c0d2fbe01dd501ae66b4870802127c4352a5f2a1593d6c332c84f54bc1596d5b"} Oct 03 08:21:52 crc kubenswrapper[4664]: I1003 08:21:52.320757 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nlrn" event={"ID":"fa1b990a-87d4-4c51-b834-474fd1b6373c","Type":"ContainerStarted","Data":"eccb4458fbc5ef76a32a07578d4f2cac7550efda1ea0c24c1e9ebd30e0114565"} Oct 03 08:21:53 crc kubenswrapper[4664]: I1003 08:21:53.330466 4664 generic.go:334] "Generic (PLEG): container finished" podID="fa1b990a-87d4-4c51-b834-474fd1b6373c" containerID="eccb4458fbc5ef76a32a07578d4f2cac7550efda1ea0c24c1e9ebd30e0114565" exitCode=0 Oct 03 08:21:53 crc kubenswrapper[4664]: I1003 08:21:53.330535 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nlrn" event={"ID":"fa1b990a-87d4-4c51-b834-474fd1b6373c","Type":"ContainerDied","Data":"eccb4458fbc5ef76a32a07578d4f2cac7550efda1ea0c24c1e9ebd30e0114565"} Oct 03 08:21:54 crc kubenswrapper[4664]: I1003 08:21:54.341055 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nlrn" event={"ID":"fa1b990a-87d4-4c51-b834-474fd1b6373c","Type":"ContainerStarted","Data":"6e3f25f7cba3ec63403bee8597f4b7d227e4c47ce1f44b568614c83bb2059925"} Oct 03 08:21:54 crc kubenswrapper[4664]: I1003 08:21:54.367677 
Oct 03 08:21:54 crc kubenswrapper[4664]: I1003 08:21:54.367677 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2nlrn" podStartSLOduration=2.9044122999999997 podStartE2EDuration="5.367651708s" podCreationTimestamp="2025-10-03 08:21:49 +0000 UTC" firstStartedPulling="2025-10-03 08:21:51.31128223 +0000 UTC m=+2012.132472720" lastFinishedPulling="2025-10-03 08:21:53.774521638 +0000 UTC m=+2014.595712128" observedRunningTime="2025-10-03 08:21:54.357065506 +0000 UTC m=+2015.178256016" watchObservedRunningTime="2025-10-03 08:21:54.367651708 +0000 UTC m=+2015.188842208"
Oct 03 08:21:59 crc kubenswrapper[4664]: I1003 08:21:59.632062 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2nlrn"
Oct 03 08:21:59 crc kubenswrapper[4664]: I1003 08:21:59.632884 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2nlrn"
Oct 03 08:21:59 crc kubenswrapper[4664]: I1003 08:21:59.681374 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2nlrn"
Oct 03 08:22:00 crc kubenswrapper[4664]: I1003 08:22:00.040488 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-hvj8z"]
Oct 03 08:22:00 crc kubenswrapper[4664]: I1003 08:22:00.049575 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-hvj8z"]
Oct 03 08:22:00 crc kubenswrapper[4664]: I1003 08:22:00.446531 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2nlrn"
Oct 03 08:22:00 crc kubenswrapper[4664]: I1003 08:22:00.500625 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2nlrn"]
Oct 03 08:22:01 crc kubenswrapper[4664]: I1003 08:22:01.886792 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="149fac5c-dc4f-47b5-94f7-c36271ad6bc6" path="/var/lib/kubelet/pods/149fac5c-dc4f-47b5-94f7-c36271ad6bc6/volumes"
Oct 03 08:22:02 crc kubenswrapper[4664]: I1003 08:22:02.412597 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2nlrn" podUID="fa1b990a-87d4-4c51-b834-474fd1b6373c" containerName="registry-server" containerID="cri-o://6e3f25f7cba3ec63403bee8597f4b7d227e4c47ce1f44b568614c83bb2059925" gracePeriod=2
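The "SyncLoop (probe)" lines above trace registry-server's probes: the startup probe reports unhealthy at first, flips to started once the server begins answering, and the readiness probe moves from "" (no result yet) to ready, at which point the catalog pod can serve. A sketch that keeps the latest status per (pod, probe) pair; illustrative, not kubelet code:

    import re, sys

    # Field order in these lines is probe=..., status=..., pod=...
    PROBE = re.compile(r'probe="(\w+)" status="(\w*)" pod="([^"]+)"')

    latest = {}
    for line in sys.stdin:
        if (m := PROBE.search(line)):
            probe, status, pod = m.groups()
            latest[(pod, probe)] = status or "unknown"

    for (pod, probe), status in sorted(latest.items()):
        print(f"{pod} {probe}: {status}")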
Need to start a new one" pod="openshift-marketplace/community-operators-2nlrn" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.025234 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2q5m\" (UniqueName: \"kubernetes.io/projected/fa1b990a-87d4-4c51-b834-474fd1b6373c-kube-api-access-z2q5m\") pod \"fa1b990a-87d4-4c51-b834-474fd1b6373c\" (UID: \"fa1b990a-87d4-4c51-b834-474fd1b6373c\") " Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.025325 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa1b990a-87d4-4c51-b834-474fd1b6373c-catalog-content\") pod \"fa1b990a-87d4-4c51-b834-474fd1b6373c\" (UID: \"fa1b990a-87d4-4c51-b834-474fd1b6373c\") " Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.025381 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa1b990a-87d4-4c51-b834-474fd1b6373c-utilities\") pod \"fa1b990a-87d4-4c51-b834-474fd1b6373c\" (UID: \"fa1b990a-87d4-4c51-b834-474fd1b6373c\") " Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.026750 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa1b990a-87d4-4c51-b834-474fd1b6373c-utilities" (OuterVolumeSpecName: "utilities") pod "fa1b990a-87d4-4c51-b834-474fd1b6373c" (UID: "fa1b990a-87d4-4c51-b834-474fd1b6373c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.035334 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa1b990a-87d4-4c51-b834-474fd1b6373c-kube-api-access-z2q5m" (OuterVolumeSpecName: "kube-api-access-z2q5m") pod "fa1b990a-87d4-4c51-b834-474fd1b6373c" (UID: "fa1b990a-87d4-4c51-b834-474fd1b6373c"). InnerVolumeSpecName "kube-api-access-z2q5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.076369 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa1b990a-87d4-4c51-b834-474fd1b6373c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa1b990a-87d4-4c51-b834-474fd1b6373c" (UID: "fa1b990a-87d4-4c51-b834-474fd1b6373c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.128705 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2q5m\" (UniqueName: \"kubernetes.io/projected/fa1b990a-87d4-4c51-b834-474fd1b6373c-kube-api-access-z2q5m\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.128765 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa1b990a-87d4-4c51-b834-474fd1b6373c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.128778 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa1b990a-87d4-4c51-b834-474fd1b6373c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.423559 4664 generic.go:334] "Generic (PLEG): container finished" podID="fa1b990a-87d4-4c51-b834-474fd1b6373c" containerID="6e3f25f7cba3ec63403bee8597f4b7d227e4c47ce1f44b568614c83bb2059925" exitCode=0 Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.423636 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nlrn" event={"ID":"fa1b990a-87d4-4c51-b834-474fd1b6373c","Type":"ContainerDied","Data":"6e3f25f7cba3ec63403bee8597f4b7d227e4c47ce1f44b568614c83bb2059925"} Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.424050 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nlrn" event={"ID":"fa1b990a-87d4-4c51-b834-474fd1b6373c","Type":"ContainerDied","Data":"e1969815be0a76bb57242d8895455dd56d493f3458af1f6c294c28ebe10a8ef0"} Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.423679 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2nlrn" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.424074 4664 scope.go:117] "RemoveContainer" containerID="6e3f25f7cba3ec63403bee8597f4b7d227e4c47ce1f44b568614c83bb2059925" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.458855 4664 scope.go:117] "RemoveContainer" containerID="eccb4458fbc5ef76a32a07578d4f2cac7550efda1ea0c24c1e9ebd30e0114565" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.476192 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2nlrn"] Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.485236 4664 scope.go:117] "RemoveContainer" containerID="c0d2fbe01dd501ae66b4870802127c4352a5f2a1593d6c332c84f54bc1596d5b" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.486850 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2nlrn"] Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.541413 4664 scope.go:117] "RemoveContainer" containerID="6e3f25f7cba3ec63403bee8597f4b7d227e4c47ce1f44b568614c83bb2059925" Oct 03 08:22:03 crc kubenswrapper[4664]: E1003 08:22:03.542103 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3f25f7cba3ec63403bee8597f4b7d227e4c47ce1f44b568614c83bb2059925\": container with ID starting with 6e3f25f7cba3ec63403bee8597f4b7d227e4c47ce1f44b568614c83bb2059925 not found: ID does not exist" containerID="6e3f25f7cba3ec63403bee8597f4b7d227e4c47ce1f44b568614c83bb2059925" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.542165 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3f25f7cba3ec63403bee8597f4b7d227e4c47ce1f44b568614c83bb2059925"} err="failed to get container status \"6e3f25f7cba3ec63403bee8597f4b7d227e4c47ce1f44b568614c83bb2059925\": rpc error: code = NotFound desc = could not find container \"6e3f25f7cba3ec63403bee8597f4b7d227e4c47ce1f44b568614c83bb2059925\": container with ID starting with 6e3f25f7cba3ec63403bee8597f4b7d227e4c47ce1f44b568614c83bb2059925 not found: ID does not exist" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.542194 4664 scope.go:117] "RemoveContainer" containerID="eccb4458fbc5ef76a32a07578d4f2cac7550efda1ea0c24c1e9ebd30e0114565" Oct 03 08:22:03 crc kubenswrapper[4664]: E1003 08:22:03.542543 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eccb4458fbc5ef76a32a07578d4f2cac7550efda1ea0c24c1e9ebd30e0114565\": container with ID starting with eccb4458fbc5ef76a32a07578d4f2cac7550efda1ea0c24c1e9ebd30e0114565 not found: ID does not exist" containerID="eccb4458fbc5ef76a32a07578d4f2cac7550efda1ea0c24c1e9ebd30e0114565" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.542582 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eccb4458fbc5ef76a32a07578d4f2cac7550efda1ea0c24c1e9ebd30e0114565"} err="failed to get container status \"eccb4458fbc5ef76a32a07578d4f2cac7550efda1ea0c24c1e9ebd30e0114565\": rpc error: code = NotFound desc = could not find container \"eccb4458fbc5ef76a32a07578d4f2cac7550efda1ea0c24c1e9ebd30e0114565\": container with ID starting with eccb4458fbc5ef76a32a07578d4f2cac7550efda1ea0c24c1e9ebd30e0114565 not found: ID does not exist" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.542626 4664 scope.go:117] "RemoveContainer" 
containerID="c0d2fbe01dd501ae66b4870802127c4352a5f2a1593d6c332c84f54bc1596d5b" Oct 03 08:22:03 crc kubenswrapper[4664]: E1003 08:22:03.543797 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0d2fbe01dd501ae66b4870802127c4352a5f2a1593d6c332c84f54bc1596d5b\": container with ID starting with c0d2fbe01dd501ae66b4870802127c4352a5f2a1593d6c332c84f54bc1596d5b not found: ID does not exist" containerID="c0d2fbe01dd501ae66b4870802127c4352a5f2a1593d6c332c84f54bc1596d5b" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.543915 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d2fbe01dd501ae66b4870802127c4352a5f2a1593d6c332c84f54bc1596d5b"} err="failed to get container status \"c0d2fbe01dd501ae66b4870802127c4352a5f2a1593d6c332c84f54bc1596d5b\": rpc error: code = NotFound desc = could not find container \"c0d2fbe01dd501ae66b4870802127c4352a5f2a1593d6c332c84f54bc1596d5b\": container with ID starting with c0d2fbe01dd501ae66b4870802127c4352a5f2a1593d6c332c84f54bc1596d5b not found: ID does not exist" Oct 03 08:22:03 crc kubenswrapper[4664]: I1003 08:22:03.889799 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa1b990a-87d4-4c51-b834-474fd1b6373c" path="/var/lib/kubelet/pods/fa1b990a-87d4-4c51-b834-474fd1b6373c/volumes" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.256053 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-llvzt"] Oct 03 08:22:14 crc kubenswrapper[4664]: E1003 08:22:14.257262 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1b990a-87d4-4c51-b834-474fd1b6373c" containerName="extract-utilities" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.257283 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1b990a-87d4-4c51-b834-474fd1b6373c" containerName="extract-utilities" Oct 03 08:22:14 crc kubenswrapper[4664]: E1003 08:22:14.257301 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1b990a-87d4-4c51-b834-474fd1b6373c" containerName="extract-content" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.257308 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1b990a-87d4-4c51-b834-474fd1b6373c" containerName="extract-content" Oct 03 08:22:14 crc kubenswrapper[4664]: E1003 08:22:14.257324 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1b990a-87d4-4c51-b834-474fd1b6373c" containerName="registry-server" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.257333 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1b990a-87d4-4c51-b834-474fd1b6373c" containerName="registry-server" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.257630 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1b990a-87d4-4c51-b834-474fd1b6373c" containerName="registry-server" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.259500 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.267295 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-llvzt"] Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.268748 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2073f10-19b0-41ed-8357-753ad2c6220f-catalog-content\") pod \"certified-operators-llvzt\" (UID: \"c2073f10-19b0-41ed-8357-753ad2c6220f\") " pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.268917 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k65cb\" (UniqueName: \"kubernetes.io/projected/c2073f10-19b0-41ed-8357-753ad2c6220f-kube-api-access-k65cb\") pod \"certified-operators-llvzt\" (UID: \"c2073f10-19b0-41ed-8357-753ad2c6220f\") " pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.269116 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2073f10-19b0-41ed-8357-753ad2c6220f-utilities\") pod \"certified-operators-llvzt\" (UID: \"c2073f10-19b0-41ed-8357-753ad2c6220f\") " pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.371272 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k65cb\" (UniqueName: \"kubernetes.io/projected/c2073f10-19b0-41ed-8357-753ad2c6220f-kube-api-access-k65cb\") pod \"certified-operators-llvzt\" (UID: \"c2073f10-19b0-41ed-8357-753ad2c6220f\") " pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.371696 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2073f10-19b0-41ed-8357-753ad2c6220f-utilities\") pod \"certified-operators-llvzt\" (UID: \"c2073f10-19b0-41ed-8357-753ad2c6220f\") " pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.371857 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2073f10-19b0-41ed-8357-753ad2c6220f-catalog-content\") pod \"certified-operators-llvzt\" (UID: \"c2073f10-19b0-41ed-8357-753ad2c6220f\") " pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.372486 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2073f10-19b0-41ed-8357-753ad2c6220f-catalog-content\") pod \"certified-operators-llvzt\" (UID: \"c2073f10-19b0-41ed-8357-753ad2c6220f\") " pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.373372 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2073f10-19b0-41ed-8357-753ad2c6220f-utilities\") pod \"certified-operators-llvzt\" (UID: \"c2073f10-19b0-41ed-8357-753ad2c6220f\") " pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.398113 4664 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-k65cb\" (UniqueName: \"kubernetes.io/projected/c2073f10-19b0-41ed-8357-753ad2c6220f-kube-api-access-k65cb\") pod \"certified-operators-llvzt\" (UID: \"c2073f10-19b0-41ed-8357-753ad2c6220f\") " pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:14 crc kubenswrapper[4664]: I1003 08:22:14.582522 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:15 crc kubenswrapper[4664]: I1003 08:22:15.333558 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-llvzt"] Oct 03 08:22:15 crc kubenswrapper[4664]: I1003 08:22:15.611432 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llvzt" event={"ID":"c2073f10-19b0-41ed-8357-753ad2c6220f","Type":"ContainerStarted","Data":"48571ca3a713fac2694b12f5e5235369ecfe00840943c33eb281a3a7a53e6fc3"} Oct 03 08:22:15 crc kubenswrapper[4664]: I1003 08:22:15.611486 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llvzt" event={"ID":"c2073f10-19b0-41ed-8357-753ad2c6220f","Type":"ContainerStarted","Data":"c77832bfc18bc2b202df4c9f3b11bd2a536bec55750a1c50aee5318c933ead47"} Oct 03 08:22:16 crc kubenswrapper[4664]: I1003 08:22:16.628267 4664 generic.go:334] "Generic (PLEG): container finished" podID="c2073f10-19b0-41ed-8357-753ad2c6220f" containerID="48571ca3a713fac2694b12f5e5235369ecfe00840943c33eb281a3a7a53e6fc3" exitCode=0 Oct 03 08:22:16 crc kubenswrapper[4664]: I1003 08:22:16.628369 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llvzt" event={"ID":"c2073f10-19b0-41ed-8357-753ad2c6220f","Type":"ContainerDied","Data":"48571ca3a713fac2694b12f5e5235369ecfe00840943c33eb281a3a7a53e6fc3"} Oct 03 08:22:16 crc kubenswrapper[4664]: I1003 08:22:16.631793 4664 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:22:17 crc kubenswrapper[4664]: I1003 08:22:17.638408 4664 generic.go:334] "Generic (PLEG): container finished" podID="c2073f10-19b0-41ed-8357-753ad2c6220f" containerID="57799a6f049ec612c555e2cb64ca6651a8f6976c67ba097c2b1042b0ed1a88ba" exitCode=0 Oct 03 08:22:17 crc kubenswrapper[4664]: I1003 08:22:17.638523 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llvzt" event={"ID":"c2073f10-19b0-41ed-8357-753ad2c6220f","Type":"ContainerDied","Data":"57799a6f049ec612c555e2cb64ca6651a8f6976c67ba097c2b1042b0ed1a88ba"} Oct 03 08:22:18 crc kubenswrapper[4664]: I1003 08:22:18.652478 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llvzt" event={"ID":"c2073f10-19b0-41ed-8357-753ad2c6220f","Type":"ContainerStarted","Data":"9d201200bfec627442784e6cfa6979991247ece3f8e9714c297b51dcf192755c"} Oct 03 08:22:18 crc kubenswrapper[4664]: I1003 08:22:18.673029 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-llvzt" podStartSLOduration=3.185413119 podStartE2EDuration="4.673007894s" podCreationTimestamp="2025-10-03 08:22:14 +0000 UTC" firstStartedPulling="2025-10-03 08:22:16.631373757 +0000 UTC m=+2037.452564247" lastFinishedPulling="2025-10-03 08:22:18.118968532 +0000 UTC m=+2038.940159022" observedRunningTime="2025-10-03 08:22:18.670399553 +0000 UTC m=+2039.491590063" watchObservedRunningTime="2025-10-03 
08:22:18.673007894 +0000 UTC m=+2039.494198374" Oct 03 08:22:24 crc kubenswrapper[4664]: I1003 08:22:24.583247 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:24 crc kubenswrapper[4664]: I1003 08:22:24.584803 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:24 crc kubenswrapper[4664]: I1003 08:22:24.632419 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:24 crc kubenswrapper[4664]: I1003 08:22:24.754669 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:24 crc kubenswrapper[4664]: I1003 08:22:24.867321 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-llvzt"] Oct 03 08:22:26 crc kubenswrapper[4664]: I1003 08:22:26.088777 4664 scope.go:117] "RemoveContainer" containerID="f618544ea84ac2201d9c8f6c7f6c36caf56a676ebd6fd4faf75985b6cc2c7a97" Oct 03 08:22:26 crc kubenswrapper[4664]: I1003 08:22:26.725419 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-llvzt" podUID="c2073f10-19b0-41ed-8357-753ad2c6220f" containerName="registry-server" containerID="cri-o://9d201200bfec627442784e6cfa6979991247ece3f8e9714c297b51dcf192755c" gracePeriod=2 Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.208210 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.344059 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2073f10-19b0-41ed-8357-753ad2c6220f-catalog-content\") pod \"c2073f10-19b0-41ed-8357-753ad2c6220f\" (UID: \"c2073f10-19b0-41ed-8357-753ad2c6220f\") " Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.344420 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k65cb\" (UniqueName: \"kubernetes.io/projected/c2073f10-19b0-41ed-8357-753ad2c6220f-kube-api-access-k65cb\") pod \"c2073f10-19b0-41ed-8357-753ad2c6220f\" (UID: \"c2073f10-19b0-41ed-8357-753ad2c6220f\") " Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.344470 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2073f10-19b0-41ed-8357-753ad2c6220f-utilities\") pod \"c2073f10-19b0-41ed-8357-753ad2c6220f\" (UID: \"c2073f10-19b0-41ed-8357-753ad2c6220f\") " Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.345439 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2073f10-19b0-41ed-8357-753ad2c6220f-utilities" (OuterVolumeSpecName: "utilities") pod "c2073f10-19b0-41ed-8357-753ad2c6220f" (UID: "c2073f10-19b0-41ed-8357-753ad2c6220f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.351169 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2073f10-19b0-41ed-8357-753ad2c6220f-kube-api-access-k65cb" (OuterVolumeSpecName: "kube-api-access-k65cb") pod "c2073f10-19b0-41ed-8357-753ad2c6220f" (UID: "c2073f10-19b0-41ed-8357-753ad2c6220f"). InnerVolumeSpecName "kube-api-access-k65cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.395366 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2073f10-19b0-41ed-8357-753ad2c6220f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2073f10-19b0-41ed-8357-753ad2c6220f" (UID: "c2073f10-19b0-41ed-8357-753ad2c6220f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.447559 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2073f10-19b0-41ed-8357-753ad2c6220f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.447599 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k65cb\" (UniqueName: \"kubernetes.io/projected/c2073f10-19b0-41ed-8357-753ad2c6220f-kube-api-access-k65cb\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.447627 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2073f10-19b0-41ed-8357-753ad2c6220f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.735434 4664 generic.go:334] "Generic (PLEG): container finished" podID="c2073f10-19b0-41ed-8357-753ad2c6220f" containerID="9d201200bfec627442784e6cfa6979991247ece3f8e9714c297b51dcf192755c" exitCode=0 Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.735488 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llvzt" event={"ID":"c2073f10-19b0-41ed-8357-753ad2c6220f","Type":"ContainerDied","Data":"9d201200bfec627442784e6cfa6979991247ece3f8e9714c297b51dcf192755c"} Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.735511 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-llvzt" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.735528 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llvzt" event={"ID":"c2073f10-19b0-41ed-8357-753ad2c6220f","Type":"ContainerDied","Data":"c77832bfc18bc2b202df4c9f3b11bd2a536bec55750a1c50aee5318c933ead47"} Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.735548 4664 scope.go:117] "RemoveContainer" containerID="9d201200bfec627442784e6cfa6979991247ece3f8e9714c297b51dcf192755c" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.774719 4664 scope.go:117] "RemoveContainer" containerID="57799a6f049ec612c555e2cb64ca6651a8f6976c67ba097c2b1042b0ed1a88ba" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.779437 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-llvzt"] Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.786872 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-llvzt"] Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.805544 4664 scope.go:117] "RemoveContainer" containerID="48571ca3a713fac2694b12f5e5235369ecfe00840943c33eb281a3a7a53e6fc3" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.854195 4664 scope.go:117] "RemoveContainer" containerID="9d201200bfec627442784e6cfa6979991247ece3f8e9714c297b51dcf192755c" Oct 03 08:22:27 crc kubenswrapper[4664]: E1003 08:22:27.854946 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d201200bfec627442784e6cfa6979991247ece3f8e9714c297b51dcf192755c\": container with ID starting with 9d201200bfec627442784e6cfa6979991247ece3f8e9714c297b51dcf192755c not found: ID does not exist" containerID="9d201200bfec627442784e6cfa6979991247ece3f8e9714c297b51dcf192755c" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.854980 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d201200bfec627442784e6cfa6979991247ece3f8e9714c297b51dcf192755c"} err="failed to get container status \"9d201200bfec627442784e6cfa6979991247ece3f8e9714c297b51dcf192755c\": rpc error: code = NotFound desc = could not find container \"9d201200bfec627442784e6cfa6979991247ece3f8e9714c297b51dcf192755c\": container with ID starting with 9d201200bfec627442784e6cfa6979991247ece3f8e9714c297b51dcf192755c not found: ID does not exist" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.855003 4664 scope.go:117] "RemoveContainer" containerID="57799a6f049ec612c555e2cb64ca6651a8f6976c67ba097c2b1042b0ed1a88ba" Oct 03 08:22:27 crc kubenswrapper[4664]: E1003 08:22:27.855561 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57799a6f049ec612c555e2cb64ca6651a8f6976c67ba097c2b1042b0ed1a88ba\": container with ID starting with 57799a6f049ec612c555e2cb64ca6651a8f6976c67ba097c2b1042b0ed1a88ba not found: ID does not exist" containerID="57799a6f049ec612c555e2cb64ca6651a8f6976c67ba097c2b1042b0ed1a88ba" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.855660 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57799a6f049ec612c555e2cb64ca6651a8f6976c67ba097c2b1042b0ed1a88ba"} err="failed to get container status \"57799a6f049ec612c555e2cb64ca6651a8f6976c67ba097c2b1042b0ed1a88ba\": rpc error: code = NotFound desc = could not find 
container \"57799a6f049ec612c555e2cb64ca6651a8f6976c67ba097c2b1042b0ed1a88ba\": container with ID starting with 57799a6f049ec612c555e2cb64ca6651a8f6976c67ba097c2b1042b0ed1a88ba not found: ID does not exist" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.855685 4664 scope.go:117] "RemoveContainer" containerID="48571ca3a713fac2694b12f5e5235369ecfe00840943c33eb281a3a7a53e6fc3" Oct 03 08:22:27 crc kubenswrapper[4664]: E1003 08:22:27.856199 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48571ca3a713fac2694b12f5e5235369ecfe00840943c33eb281a3a7a53e6fc3\": container with ID starting with 48571ca3a713fac2694b12f5e5235369ecfe00840943c33eb281a3a7a53e6fc3 not found: ID does not exist" containerID="48571ca3a713fac2694b12f5e5235369ecfe00840943c33eb281a3a7a53e6fc3" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.856232 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48571ca3a713fac2694b12f5e5235369ecfe00840943c33eb281a3a7a53e6fc3"} err="failed to get container status \"48571ca3a713fac2694b12f5e5235369ecfe00840943c33eb281a3a7a53e6fc3\": rpc error: code = NotFound desc = could not find container \"48571ca3a713fac2694b12f5e5235369ecfe00840943c33eb281a3a7a53e6fc3\": container with ID starting with 48571ca3a713fac2694b12f5e5235369ecfe00840943c33eb281a3a7a53e6fc3 not found: ID does not exist" Oct 03 08:22:27 crc kubenswrapper[4664]: I1003 08:22:27.887699 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2073f10-19b0-41ed-8357-753ad2c6220f" path="/var/lib/kubelet/pods/c2073f10-19b0-41ed-8357-753ad2c6220f/volumes" Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.280041 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vnrkk"] Oct 03 08:22:30 crc kubenswrapper[4664]: E1003 08:22:30.280954 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2073f10-19b0-41ed-8357-753ad2c6220f" containerName="registry-server" Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.280974 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2073f10-19b0-41ed-8357-753ad2c6220f" containerName="registry-server" Oct 03 08:22:30 crc kubenswrapper[4664]: E1003 08:22:30.280992 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2073f10-19b0-41ed-8357-753ad2c6220f" containerName="extract-content" Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.281003 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2073f10-19b0-41ed-8357-753ad2c6220f" containerName="extract-content" Oct 03 08:22:30 crc kubenswrapper[4664]: E1003 08:22:30.281054 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2073f10-19b0-41ed-8357-753ad2c6220f" containerName="extract-utilities" Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.281066 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2073f10-19b0-41ed-8357-753ad2c6220f" containerName="extract-utilities" Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.281304 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2073f10-19b0-41ed-8357-753ad2c6220f" containerName="registry-server" Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.286869 4664 util.go:30] "No sandbox for pod can be found. 
Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.286869 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnrkk"
Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.290909 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnrkk"]
Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.405178 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef635d74-a284-4694-be18-b8bb1ba79214-utilities\") pod \"redhat-marketplace-vnrkk\" (UID: \"ef635d74-a284-4694-be18-b8bb1ba79214\") " pod="openshift-marketplace/redhat-marketplace-vnrkk"
Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.405306 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp6fr\" (UniqueName: \"kubernetes.io/projected/ef635d74-a284-4694-be18-b8bb1ba79214-kube-api-access-tp6fr\") pod \"redhat-marketplace-vnrkk\" (UID: \"ef635d74-a284-4694-be18-b8bb1ba79214\") " pod="openshift-marketplace/redhat-marketplace-vnrkk"
Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.405349 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef635d74-a284-4694-be18-b8bb1ba79214-catalog-content\") pod \"redhat-marketplace-vnrkk\" (UID: \"ef635d74-a284-4694-be18-b8bb1ba79214\") " pod="openshift-marketplace/redhat-marketplace-vnrkk"
Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.506506 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef635d74-a284-4694-be18-b8bb1ba79214-utilities\") pod \"redhat-marketplace-vnrkk\" (UID: \"ef635d74-a284-4694-be18-b8bb1ba79214\") " pod="openshift-marketplace/redhat-marketplace-vnrkk"
Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.506895 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp6fr\" (UniqueName: \"kubernetes.io/projected/ef635d74-a284-4694-be18-b8bb1ba79214-kube-api-access-tp6fr\") pod \"redhat-marketplace-vnrkk\" (UID: \"ef635d74-a284-4694-be18-b8bb1ba79214\") " pod="openshift-marketplace/redhat-marketplace-vnrkk"
Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.506990 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef635d74-a284-4694-be18-b8bb1ba79214-catalog-content\") pod \"redhat-marketplace-vnrkk\" (UID: \"ef635d74-a284-4694-be18-b8bb1ba79214\") " pod="openshift-marketplace/redhat-marketplace-vnrkk"
Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.507204 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef635d74-a284-4694-be18-b8bb1ba79214-utilities\") pod \"redhat-marketplace-vnrkk\" (UID: \"ef635d74-a284-4694-be18-b8bb1ba79214\") " pod="openshift-marketplace/redhat-marketplace-vnrkk"
Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.507308 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef635d74-a284-4694-be18-b8bb1ba79214-catalog-content\") pod \"redhat-marketplace-vnrkk\" (UID: \"ef635d74-a284-4694-be18-b8bb1ba79214\") " pod="openshift-marketplace/redhat-marketplace-vnrkk"
Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.531881 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp6fr\" (UniqueName: \"kubernetes.io/projected/ef635d74-a284-4694-be18-b8bb1ba79214-kube-api-access-tp6fr\") pod \"redhat-marketplace-vnrkk\" (UID: \"ef635d74-a284-4694-be18-b8bb1ba79214\") " pod="openshift-marketplace/redhat-marketplace-vnrkk"
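[annotation] Each "MountVolume.SetUp succeeded" above materializes a directory under the pod's UID tree; by the usual on-disk convention the plugin name is escaped by replacing "/" with "~", which is why the "Cleaned up orphaned pod volumes dir" lines elsewhere in this log point at /var/lib/kubelet/pods/<uid>/volumes. A small sketch of that convention (assumes the default kubelet root of /var/lib/kubelet):

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// volumePath builds the directory where the kubelet mounts a volume:
// /var/lib/kubelet/pods/<podUID>/volumes/<escaped plugin>/<volume>.
func volumePath(podUID, plugin, volume string) string {
	escaped := strings.ReplaceAll(plugin, "/", "~") // kubernetes.io/empty-dir -> kubernetes.io~empty-dir
	return filepath.Join("/var/lib/kubelet/pods", podUID, "volumes", escaped, volume)
}

func main() {
	fmt.Println(volumePath("ef635d74-a284-4694-be18-b8bb1ba79214",
		"kubernetes.io/empty-dir", "catalog-content"))
	// /var/lib/kubelet/pods/ef635d74-.../volumes/kubernetes.io~empty-dir/catalog-content
}
```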
succeeded for volume \"kube-api-access-tp6fr\" (UniqueName: \"kubernetes.io/projected/ef635d74-a284-4694-be18-b8bb1ba79214-kube-api-access-tp6fr\") pod \"redhat-marketplace-vnrkk\" (UID: \"ef635d74-a284-4694-be18-b8bb1ba79214\") " pod="openshift-marketplace/redhat-marketplace-vnrkk" Oct 03 08:22:30 crc kubenswrapper[4664]: I1003 08:22:30.618970 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnrkk" Oct 03 08:22:31 crc kubenswrapper[4664]: I1003 08:22:31.119840 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnrkk"] Oct 03 08:22:31 crc kubenswrapper[4664]: I1003 08:22:31.773334 4664 generic.go:334] "Generic (PLEG): container finished" podID="ef635d74-a284-4694-be18-b8bb1ba79214" containerID="8aeccae7b25abea0736a885ed1cc1b3056414e69f1c6103b8f3a79db4500f559" exitCode=0 Oct 03 08:22:31 crc kubenswrapper[4664]: I1003 08:22:31.773496 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnrkk" event={"ID":"ef635d74-a284-4694-be18-b8bb1ba79214","Type":"ContainerDied","Data":"8aeccae7b25abea0736a885ed1cc1b3056414e69f1c6103b8f3a79db4500f559"} Oct 03 08:22:31 crc kubenswrapper[4664]: I1003 08:22:31.774353 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnrkk" event={"ID":"ef635d74-a284-4694-be18-b8bb1ba79214","Type":"ContainerStarted","Data":"03db9a320086139d368522aa31c7f59f57336aed7894b3cf38ec35ab7326efa9"} Oct 03 08:22:33 crc kubenswrapper[4664]: I1003 08:22:33.793749 4664 generic.go:334] "Generic (PLEG): container finished" podID="ef635d74-a284-4694-be18-b8bb1ba79214" containerID="c1ffadc3ec8201395e4f9b714d882b779f3b911fcda6bebceaf3ce7158c3739f" exitCode=0 Oct 03 08:22:33 crc kubenswrapper[4664]: I1003 08:22:33.793842 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnrkk" event={"ID":"ef635d74-a284-4694-be18-b8bb1ba79214","Type":"ContainerDied","Data":"c1ffadc3ec8201395e4f9b714d882b779f3b911fcda6bebceaf3ce7158c3739f"} Oct 03 08:22:34 crc kubenswrapper[4664]: I1003 08:22:34.805576 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnrkk" event={"ID":"ef635d74-a284-4694-be18-b8bb1ba79214","Type":"ContainerStarted","Data":"d2308f87fe018731e4bf5c847415526dae54b31f7d4a5fc5490165b0ad8051b7"} Oct 03 08:22:34 crc kubenswrapper[4664]: I1003 08:22:34.808060 4664 generic.go:334] "Generic (PLEG): container finished" podID="fa362fe3-175b-4212-b34c-341eab1572cf" containerID="69948ce356e51ab67998072d5f8c92156426ba1c3fe7b4f57c1f576c323bfb17" exitCode=0 Oct 03 08:22:34 crc kubenswrapper[4664]: I1003 08:22:34.808107 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" event={"ID":"fa362fe3-175b-4212-b34c-341eab1572cf","Type":"ContainerDied","Data":"69948ce356e51ab67998072d5f8c92156426ba1c3fe7b4f57c1f576c323bfb17"} Oct 03 08:22:34 crc kubenswrapper[4664]: I1003 08:22:34.826797 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vnrkk" podStartSLOduration=2.340654506 podStartE2EDuration="4.826770624s" podCreationTimestamp="2025-10-03 08:22:30 +0000 UTC" firstStartedPulling="2025-10-03 08:22:31.776746052 +0000 UTC m=+2052.597936542" lastFinishedPulling="2025-10-03 08:22:34.26286217 +0000 UTC m=+2055.084052660" observedRunningTime="2025-10-03 
Oct 03 08:22:34 crc kubenswrapper[4664]: I1003 08:22:34.826797 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vnrkk" podStartSLOduration=2.340654506 podStartE2EDuration="4.826770624s" podCreationTimestamp="2025-10-03 08:22:30 +0000 UTC" firstStartedPulling="2025-10-03 08:22:31.776746052 +0000 UTC m=+2052.597936542" lastFinishedPulling="2025-10-03 08:22:34.26286217 +0000 UTC m=+2055.084052660" observedRunningTime="2025-10-03 08:22:34.824739548 +0000 UTC m=+2055.645930058" watchObservedRunningTime="2025-10-03 08:22:34.826770624 +0000 UTC m=+2055.647961124"
Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.261989 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r"
Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.330685 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkw5w\" (UniqueName: \"kubernetes.io/projected/fa362fe3-175b-4212-b34c-341eab1572cf-kube-api-access-nkw5w\") pod \"fa362fe3-175b-4212-b34c-341eab1572cf\" (UID: \"fa362fe3-175b-4212-b34c-341eab1572cf\") "
Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.331228 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa362fe3-175b-4212-b34c-341eab1572cf-inventory\") pod \"fa362fe3-175b-4212-b34c-341eab1572cf\" (UID: \"fa362fe3-175b-4212-b34c-341eab1572cf\") "
Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.331267 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa362fe3-175b-4212-b34c-341eab1572cf-ssh-key\") pod \"fa362fe3-175b-4212-b34c-341eab1572cf\" (UID: \"fa362fe3-175b-4212-b34c-341eab1572cf\") "
Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.337595 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa362fe3-175b-4212-b34c-341eab1572cf-kube-api-access-nkw5w" (OuterVolumeSpecName: "kube-api-access-nkw5w") pod "fa362fe3-175b-4212-b34c-341eab1572cf" (UID: "fa362fe3-175b-4212-b34c-341eab1572cf"). InnerVolumeSpecName "kube-api-access-nkw5w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.360927 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa362fe3-175b-4212-b34c-341eab1572cf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fa362fe3-175b-4212-b34c-341eab1572cf" (UID: "fa362fe3-175b-4212-b34c-341eab1572cf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.363489 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa362fe3-175b-4212-b34c-341eab1572cf-inventory" (OuterVolumeSpecName: "inventory") pod "fa362fe3-175b-4212-b34c-341eab1572cf" (UID: "fa362fe3-175b-4212-b34c-341eab1572cf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
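[annotation] The "Observed pod startup duration" entry above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling); the entry's own numbers confirm the rule. Checking with the quoted timestamps:

```go
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Timestamps copied from the log entry above.
	created := mustParse("2025-10-03 08:22:30 +0000 UTC")
	firstPull := mustParse("2025-10-03 08:22:31.776746052 +0000 UTC")
	lastPull := mustParse("2025-10-03 08:22:34.26286217 +0000 UTC")
	running := mustParse("2025-10-03 08:22:34.826770624 +0000 UTC")

	e2e := running.Sub(created)          // podStartE2EDuration: 4.826770624s
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: 2.340654506s
	fmt.Println(e2e, slo)
}
```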
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.434018 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkw5w\" (UniqueName: \"kubernetes.io/projected/fa362fe3-175b-4212-b34c-341eab1572cf-kube-api-access-nkw5w\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.434066 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa362fe3-175b-4212-b34c-341eab1572cf-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.434076 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa362fe3-175b-4212-b34c-341eab1572cf-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.837098 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" event={"ID":"fa362fe3-175b-4212-b34c-341eab1572cf","Type":"ContainerDied","Data":"89c5ed45760366f6e231e7840c768dd2318282935c439501b2184e8ecedfebcf"} Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.837151 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89c5ed45760366f6e231e7840c768dd2318282935c439501b2184e8ecedfebcf" Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.837155 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-77v4r" Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.944207 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fgwgk"] Oct 03 08:22:36 crc kubenswrapper[4664]: E1003 08:22:36.944679 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa362fe3-175b-4212-b34c-341eab1572cf" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.944694 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa362fe3-175b-4212-b34c-341eab1572cf" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.944931 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa362fe3-175b-4212-b34c-341eab1572cf" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.945767 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fgwgk" Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.948642 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.949065 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.949340 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.949509 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:22:36 crc kubenswrapper[4664]: I1003 08:22:36.958085 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fgwgk"] Oct 03 08:22:37 crc kubenswrapper[4664]: I1003 08:22:37.047336 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08f00e80-ab83-47e3-b8e6-71d4d76300c4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-fgwgk\" (UID: \"08f00e80-ab83-47e3-b8e6-71d4d76300c4\") " pod="openstack/ssh-known-hosts-edpm-deployment-fgwgk" Oct 03 08:22:37 crc kubenswrapper[4664]: I1003 08:22:37.047410 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjjjp\" (UniqueName: \"kubernetes.io/projected/08f00e80-ab83-47e3-b8e6-71d4d76300c4-kube-api-access-wjjjp\") pod \"ssh-known-hosts-edpm-deployment-fgwgk\" (UID: \"08f00e80-ab83-47e3-b8e6-71d4d76300c4\") " pod="openstack/ssh-known-hosts-edpm-deployment-fgwgk" Oct 03 08:22:37 crc kubenswrapper[4664]: I1003 08:22:37.047462 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08f00e80-ab83-47e3-b8e6-71d4d76300c4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-fgwgk\" (UID: \"08f00e80-ab83-47e3-b8e6-71d4d76300c4\") " pod="openstack/ssh-known-hosts-edpm-deployment-fgwgk" Oct 03 08:22:37 crc kubenswrapper[4664]: I1003 08:22:37.149555 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08f00e80-ab83-47e3-b8e6-71d4d76300c4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-fgwgk\" (UID: \"08f00e80-ab83-47e3-b8e6-71d4d76300c4\") " pod="openstack/ssh-known-hosts-edpm-deployment-fgwgk" Oct 03 08:22:37 crc kubenswrapper[4664]: I1003 08:22:37.149937 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjjjp\" (UniqueName: \"kubernetes.io/projected/08f00e80-ab83-47e3-b8e6-71d4d76300c4-kube-api-access-wjjjp\") pod \"ssh-known-hosts-edpm-deployment-fgwgk\" (UID: \"08f00e80-ab83-47e3-b8e6-71d4d76300c4\") " pod="openstack/ssh-known-hosts-edpm-deployment-fgwgk" Oct 03 08:22:37 crc kubenswrapper[4664]: I1003 08:22:37.150096 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08f00e80-ab83-47e3-b8e6-71d4d76300c4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-fgwgk\" (UID: \"08f00e80-ab83-47e3-b8e6-71d4d76300c4\") " pod="openstack/ssh-known-hosts-edpm-deployment-fgwgk" Oct 03 08:22:37 crc 
Oct 03 08:22:37 crc kubenswrapper[4664]: I1003 08:22:37.266341 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fgwgk"
Oct 03 08:22:37 crc kubenswrapper[4664]: I1003 08:22:37.777650 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fgwgk"]
Oct 03 08:22:37 crc kubenswrapper[4664]: W1003 08:22:37.779492 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08f00e80_ab83_47e3_b8e6_71d4d76300c4.slice/crio-1bf8496989118e1389b7b31d53bd971fcab758c30bbdbda22b3b9fa187a1cbf5 WatchSource:0}: Error finding container 1bf8496989118e1389b7b31d53bd971fcab758c30bbdbda22b3b9fa187a1cbf5: Status 404 returned error can't find the container with id 1bf8496989118e1389b7b31d53bd971fcab758c30bbdbda22b3b9fa187a1cbf5
Oct 03 08:22:37 crc kubenswrapper[4664]: I1003 08:22:37.853901 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fgwgk" event={"ID":"08f00e80-ab83-47e3-b8e6-71d4d76300c4","Type":"ContainerStarted","Data":"1bf8496989118e1389b7b31d53bd971fcab758c30bbdbda22b3b9fa187a1cbf5"}
Oct 03 08:22:38 crc kubenswrapper[4664]: I1003 08:22:38.866719 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fgwgk" event={"ID":"08f00e80-ab83-47e3-b8e6-71d4d76300c4","Type":"ContainerStarted","Data":"d6935d7768ed7bb1aa33d0e4284dca095b9dd29009f74ac60e5a57cf7f5b18fe"}
Oct 03 08:22:38 crc kubenswrapper[4664]: I1003 08:22:38.891587 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-fgwgk" podStartSLOduration=2.383673398 podStartE2EDuration="2.891564579s" podCreationTimestamp="2025-10-03 08:22:36 +0000 UTC" firstStartedPulling="2025-10-03 08:22:37.781356326 +0000 UTC m=+2058.602546816" lastFinishedPulling="2025-10-03 08:22:38.289247497 +0000 UTC m=+2059.110437997" observedRunningTime="2025-10-03 08:22:38.883738314 +0000 UTC m=+2059.704928824" watchObservedRunningTime="2025-10-03 08:22:38.891564579 +0000 UTC m=+2059.712755079"
pod="openshift-marketplace/redhat-marketplace-vnrkk" Oct 03 08:22:40 crc kubenswrapper[4664]: I1003 08:22:40.620593 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vnrkk" Oct 03 08:22:40 crc kubenswrapper[4664]: I1003 08:22:40.666940 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vnrkk" Oct 03 08:22:40 crc kubenswrapper[4664]: I1003 08:22:40.937953 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vnrkk" Oct 03 08:22:40 crc kubenswrapper[4664]: I1003 08:22:40.987412 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnrkk"] Oct 03 08:22:42 crc kubenswrapper[4664]: I1003 08:22:42.901668 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vnrkk" podUID="ef635d74-a284-4694-be18-b8bb1ba79214" containerName="registry-server" containerID="cri-o://d2308f87fe018731e4bf5c847415526dae54b31f7d4a5fc5490165b0ad8051b7" gracePeriod=2 Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.339512 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnrkk" Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.494564 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp6fr\" (UniqueName: \"kubernetes.io/projected/ef635d74-a284-4694-be18-b8bb1ba79214-kube-api-access-tp6fr\") pod \"ef635d74-a284-4694-be18-b8bb1ba79214\" (UID: \"ef635d74-a284-4694-be18-b8bb1ba79214\") " Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.495088 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef635d74-a284-4694-be18-b8bb1ba79214-catalog-content\") pod \"ef635d74-a284-4694-be18-b8bb1ba79214\" (UID: \"ef635d74-a284-4694-be18-b8bb1ba79214\") " Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.495319 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef635d74-a284-4694-be18-b8bb1ba79214-utilities\") pod \"ef635d74-a284-4694-be18-b8bb1ba79214\" (UID: \"ef635d74-a284-4694-be18-b8bb1ba79214\") " Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.496427 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef635d74-a284-4694-be18-b8bb1ba79214-utilities" (OuterVolumeSpecName: "utilities") pod "ef635d74-a284-4694-be18-b8bb1ba79214" (UID: "ef635d74-a284-4694-be18-b8bb1ba79214"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.501815 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef635d74-a284-4694-be18-b8bb1ba79214-kube-api-access-tp6fr" (OuterVolumeSpecName: "kube-api-access-tp6fr") pod "ef635d74-a284-4694-be18-b8bb1ba79214" (UID: "ef635d74-a284-4694-be18-b8bb1ba79214"). InnerVolumeSpecName "kube-api-access-tp6fr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.510642 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef635d74-a284-4694-be18-b8bb1ba79214-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef635d74-a284-4694-be18-b8bb1ba79214" (UID: "ef635d74-a284-4694-be18-b8bb1ba79214"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.597980 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef635d74-a284-4694-be18-b8bb1ba79214-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.598029 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp6fr\" (UniqueName: \"kubernetes.io/projected/ef635d74-a284-4694-be18-b8bb1ba79214-kube-api-access-tp6fr\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.598042 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef635d74-a284-4694-be18-b8bb1ba79214-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.926313 4664 generic.go:334] "Generic (PLEG): container finished" podID="ef635d74-a284-4694-be18-b8bb1ba79214" containerID="d2308f87fe018731e4bf5c847415526dae54b31f7d4a5fc5490165b0ad8051b7" exitCode=0 Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.926371 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnrkk" event={"ID":"ef635d74-a284-4694-be18-b8bb1ba79214","Type":"ContainerDied","Data":"d2308f87fe018731e4bf5c847415526dae54b31f7d4a5fc5490165b0ad8051b7"} Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.926407 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnrkk" event={"ID":"ef635d74-a284-4694-be18-b8bb1ba79214","Type":"ContainerDied","Data":"03db9a320086139d368522aa31c7f59f57336aed7894b3cf38ec35ab7326efa9"} Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.926430 4664 scope.go:117] "RemoveContainer" containerID="d2308f87fe018731e4bf5c847415526dae54b31f7d4a5fc5490165b0ad8051b7" Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.926465 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnrkk" Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.954651 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnrkk"] Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.957860 4664 scope.go:117] "RemoveContainer" containerID="c1ffadc3ec8201395e4f9b714d882b779f3b911fcda6bebceaf3ce7158c3739f" Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.962976 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnrkk"] Oct 03 08:22:43 crc kubenswrapper[4664]: I1003 08:22:43.977687 4664 scope.go:117] "RemoveContainer" containerID="8aeccae7b25abea0736a885ed1cc1b3056414e69f1c6103b8f3a79db4500f559" Oct 03 08:22:44 crc kubenswrapper[4664]: I1003 08:22:44.024017 4664 scope.go:117] "RemoveContainer" containerID="d2308f87fe018731e4bf5c847415526dae54b31f7d4a5fc5490165b0ad8051b7" Oct 03 08:22:44 crc kubenswrapper[4664]: E1003 08:22:44.024666 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2308f87fe018731e4bf5c847415526dae54b31f7d4a5fc5490165b0ad8051b7\": container with ID starting with d2308f87fe018731e4bf5c847415526dae54b31f7d4a5fc5490165b0ad8051b7 not found: ID does not exist" containerID="d2308f87fe018731e4bf5c847415526dae54b31f7d4a5fc5490165b0ad8051b7" Oct 03 08:22:44 crc kubenswrapper[4664]: I1003 08:22:44.024749 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2308f87fe018731e4bf5c847415526dae54b31f7d4a5fc5490165b0ad8051b7"} err="failed to get container status \"d2308f87fe018731e4bf5c847415526dae54b31f7d4a5fc5490165b0ad8051b7\": rpc error: code = NotFound desc = could not find container \"d2308f87fe018731e4bf5c847415526dae54b31f7d4a5fc5490165b0ad8051b7\": container with ID starting with d2308f87fe018731e4bf5c847415526dae54b31f7d4a5fc5490165b0ad8051b7 not found: ID does not exist" Oct 03 08:22:44 crc kubenswrapper[4664]: I1003 08:22:44.024783 4664 scope.go:117] "RemoveContainer" containerID="c1ffadc3ec8201395e4f9b714d882b779f3b911fcda6bebceaf3ce7158c3739f" Oct 03 08:22:44 crc kubenswrapper[4664]: E1003 08:22:44.025092 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1ffadc3ec8201395e4f9b714d882b779f3b911fcda6bebceaf3ce7158c3739f\": container with ID starting with c1ffadc3ec8201395e4f9b714d882b779f3b911fcda6bebceaf3ce7158c3739f not found: ID does not exist" containerID="c1ffadc3ec8201395e4f9b714d882b779f3b911fcda6bebceaf3ce7158c3739f" Oct 03 08:22:44 crc kubenswrapper[4664]: I1003 08:22:44.025113 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ffadc3ec8201395e4f9b714d882b779f3b911fcda6bebceaf3ce7158c3739f"} err="failed to get container status \"c1ffadc3ec8201395e4f9b714d882b779f3b911fcda6bebceaf3ce7158c3739f\": rpc error: code = NotFound desc = could not find container \"c1ffadc3ec8201395e4f9b714d882b779f3b911fcda6bebceaf3ce7158c3739f\": container with ID starting with c1ffadc3ec8201395e4f9b714d882b779f3b911fcda6bebceaf3ce7158c3739f not found: ID does not exist" Oct 03 08:22:44 crc kubenswrapper[4664]: I1003 08:22:44.025127 4664 scope.go:117] "RemoveContainer" containerID="8aeccae7b25abea0736a885ed1cc1b3056414e69f1c6103b8f3a79db4500f559" Oct 03 08:22:44 crc kubenswrapper[4664]: E1003 08:22:44.025461 4664 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8aeccae7b25abea0736a885ed1cc1b3056414e69f1c6103b8f3a79db4500f559\": container with ID starting with 8aeccae7b25abea0736a885ed1cc1b3056414e69f1c6103b8f3a79db4500f559 not found: ID does not exist" containerID="8aeccae7b25abea0736a885ed1cc1b3056414e69f1c6103b8f3a79db4500f559" Oct 03 08:22:44 crc kubenswrapper[4664]: I1003 08:22:44.025488 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aeccae7b25abea0736a885ed1cc1b3056414e69f1c6103b8f3a79db4500f559"} err="failed to get container status \"8aeccae7b25abea0736a885ed1cc1b3056414e69f1c6103b8f3a79db4500f559\": rpc error: code = NotFound desc = could not find container \"8aeccae7b25abea0736a885ed1cc1b3056414e69f1c6103b8f3a79db4500f559\": container with ID starting with 8aeccae7b25abea0736a885ed1cc1b3056414e69f1c6103b8f3a79db4500f559 not found: ID does not exist" Oct 03 08:22:45 crc kubenswrapper[4664]: I1003 08:22:45.889273 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef635d74-a284-4694-be18-b8bb1ba79214" path="/var/lib/kubelet/pods/ef635d74-a284-4694-be18-b8bb1ba79214/volumes" Oct 03 08:22:45 crc kubenswrapper[4664]: I1003 08:22:45.948794 4664 generic.go:334] "Generic (PLEG): container finished" podID="08f00e80-ab83-47e3-b8e6-71d4d76300c4" containerID="d6935d7768ed7bb1aa33d0e4284dca095b9dd29009f74ac60e5a57cf7f5b18fe" exitCode=0 Oct 03 08:22:45 crc kubenswrapper[4664]: I1003 08:22:45.948849 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fgwgk" event={"ID":"08f00e80-ab83-47e3-b8e6-71d4d76300c4","Type":"ContainerDied","Data":"d6935d7768ed7bb1aa33d0e4284dca095b9dd29009f74ac60e5a57cf7f5b18fe"} Oct 03 08:22:47 crc kubenswrapper[4664]: I1003 08:22:47.373926 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fgwgk" Oct 03 08:22:47 crc kubenswrapper[4664]: I1003 08:22:47.480269 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08f00e80-ab83-47e3-b8e6-71d4d76300c4-inventory-0\") pod \"08f00e80-ab83-47e3-b8e6-71d4d76300c4\" (UID: \"08f00e80-ab83-47e3-b8e6-71d4d76300c4\") " Oct 03 08:22:47 crc kubenswrapper[4664]: I1003 08:22:47.480397 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjjjp\" (UniqueName: \"kubernetes.io/projected/08f00e80-ab83-47e3-b8e6-71d4d76300c4-kube-api-access-wjjjp\") pod \"08f00e80-ab83-47e3-b8e6-71d4d76300c4\" (UID: \"08f00e80-ab83-47e3-b8e6-71d4d76300c4\") " Oct 03 08:22:47 crc kubenswrapper[4664]: I1003 08:22:47.480428 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08f00e80-ab83-47e3-b8e6-71d4d76300c4-ssh-key-openstack-edpm-ipam\") pod \"08f00e80-ab83-47e3-b8e6-71d4d76300c4\" (UID: \"08f00e80-ab83-47e3-b8e6-71d4d76300c4\") " Oct 03 08:22:47 crc kubenswrapper[4664]: I1003 08:22:47.486215 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08f00e80-ab83-47e3-b8e6-71d4d76300c4-kube-api-access-wjjjp" (OuterVolumeSpecName: "kube-api-access-wjjjp") pod "08f00e80-ab83-47e3-b8e6-71d4d76300c4" (UID: "08f00e80-ab83-47e3-b8e6-71d4d76300c4"). InnerVolumeSpecName "kube-api-access-wjjjp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:22:47 crc kubenswrapper[4664]: I1003 08:22:47.509215 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f00e80-ab83-47e3-b8e6-71d4d76300c4-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "08f00e80-ab83-47e3-b8e6-71d4d76300c4" (UID: "08f00e80-ab83-47e3-b8e6-71d4d76300c4"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:22:47 crc kubenswrapper[4664]: I1003 08:22:47.513083 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f00e80-ab83-47e3-b8e6-71d4d76300c4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "08f00e80-ab83-47e3-b8e6-71d4d76300c4" (UID: "08f00e80-ab83-47e3-b8e6-71d4d76300c4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:22:47 crc kubenswrapper[4664]: I1003 08:22:47.583271 4664 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08f00e80-ab83-47e3-b8e6-71d4d76300c4-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:47 crc kubenswrapper[4664]: I1003 08:22:47.583310 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjjjp\" (UniqueName: \"kubernetes.io/projected/08f00e80-ab83-47e3-b8e6-71d4d76300c4-kube-api-access-wjjjp\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:47 crc kubenswrapper[4664]: I1003 08:22:47.583326 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08f00e80-ab83-47e3-b8e6-71d4d76300c4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:47 crc kubenswrapper[4664]: I1003 08:22:47.969091 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fgwgk" event={"ID":"08f00e80-ab83-47e3-b8e6-71d4d76300c4","Type":"ContainerDied","Data":"1bf8496989118e1389b7b31d53bd971fcab758c30bbdbda22b3b9fa187a1cbf5"} Oct 03 08:22:47 crc kubenswrapper[4664]: I1003 08:22:47.969164 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fgwgk" Oct 03 08:22:47 crc kubenswrapper[4664]: I1003 08:22:47.969177 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bf8496989118e1389b7b31d53bd971fcab758c30bbdbda22b3b9fa187a1cbf5" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.045585 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964"] Oct 03 08:22:48 crc kubenswrapper[4664]: E1003 08:22:48.046566 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef635d74-a284-4694-be18-b8bb1ba79214" containerName="extract-content" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.046582 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef635d74-a284-4694-be18-b8bb1ba79214" containerName="extract-content" Oct 03 08:22:48 crc kubenswrapper[4664]: E1003 08:22:48.046595 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f00e80-ab83-47e3-b8e6-71d4d76300c4" containerName="ssh-known-hosts-edpm-deployment" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.046601 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f00e80-ab83-47e3-b8e6-71d4d76300c4" containerName="ssh-known-hosts-edpm-deployment" Oct 03 08:22:48 crc kubenswrapper[4664]: E1003 08:22:48.046620 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef635d74-a284-4694-be18-b8bb1ba79214" containerName="extract-utilities" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.046626 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef635d74-a284-4694-be18-b8bb1ba79214" containerName="extract-utilities" Oct 03 08:22:48 crc kubenswrapper[4664]: E1003 08:22:48.046645 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef635d74-a284-4694-be18-b8bb1ba79214" containerName="registry-server" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.046651 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef635d74-a284-4694-be18-b8bb1ba79214" containerName="registry-server" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.046816 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef635d74-a284-4694-be18-b8bb1ba79214" containerName="registry-server" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.046845 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="08f00e80-ab83-47e3-b8e6-71d4d76300c4" containerName="ssh-known-hosts-edpm-deployment" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.047534 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.052995 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.053358 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.053543 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.054369 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.064231 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964"] Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.199883 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slqhf\" (UniqueName: \"kubernetes.io/projected/9f440401-13c2-4c2f-aff8-41e856a920c5-kube-api-access-slqhf\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zs964\" (UID: \"9f440401-13c2-4c2f-aff8-41e856a920c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.200004 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f440401-13c2-4c2f-aff8-41e856a920c5-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zs964\" (UID: \"9f440401-13c2-4c2f-aff8-41e856a920c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.200127 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f440401-13c2-4c2f-aff8-41e856a920c5-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zs964\" (UID: \"9f440401-13c2-4c2f-aff8-41e856a920c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.302174 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slqhf\" (UniqueName: \"kubernetes.io/projected/9f440401-13c2-4c2f-aff8-41e856a920c5-kube-api-access-slqhf\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zs964\" (UID: \"9f440401-13c2-4c2f-aff8-41e856a920c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.302270 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f440401-13c2-4c2f-aff8-41e856a920c5-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zs964\" (UID: \"9f440401-13c2-4c2f-aff8-41e856a920c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.302355 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f440401-13c2-4c2f-aff8-41e856a920c5-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zs964\" (UID: \"9f440401-13c2-4c2f-aff8-41e856a920c5\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.310548 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f440401-13c2-4c2f-aff8-41e856a920c5-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zs964\" (UID: \"9f440401-13c2-4c2f-aff8-41e856a920c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.313367 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f440401-13c2-4c2f-aff8-41e856a920c5-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zs964\" (UID: \"9f440401-13c2-4c2f-aff8-41e856a920c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.321918 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slqhf\" (UniqueName: \"kubernetes.io/projected/9f440401-13c2-4c2f-aff8-41e856a920c5-kube-api-access-slqhf\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zs964\" (UID: \"9f440401-13c2-4c2f-aff8-41e856a920c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.373781 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.866826 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964"] Oct 03 08:22:48 crc kubenswrapper[4664]: I1003 08:22:48.978329 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" event={"ID":"9f440401-13c2-4c2f-aff8-41e856a920c5","Type":"ContainerStarted","Data":"328e836aa9a92d39f7358af2b83f7e1ff08a6f9ccd3a1e03ef24d1d2c5d184d6"} Oct 03 08:22:49 crc kubenswrapper[4664]: I1003 08:22:49.987249 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" event={"ID":"9f440401-13c2-4c2f-aff8-41e856a920c5","Type":"ContainerStarted","Data":"afa04ff0aa192821d9da77b71c4db4c7edaeedfdb6b8ffce0c3ddea19ffb58e7"} Oct 03 08:22:50 crc kubenswrapper[4664]: I1003 08:22:50.012053 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" podStartSLOduration=1.782028191 podStartE2EDuration="2.01202608s" podCreationTimestamp="2025-10-03 08:22:48 +0000 UTC" firstStartedPulling="2025-10-03 08:22:48.873145028 +0000 UTC m=+2069.694335518" lastFinishedPulling="2025-10-03 08:22:49.103142917 +0000 UTC m=+2069.924333407" observedRunningTime="2025-10-03 08:22:50.005644865 +0000 UTC m=+2070.826835365" watchObservedRunningTime="2025-10-03 08:22:50.01202608 +0000 UTC m=+2070.833216570" Oct 03 08:22:58 crc kubenswrapper[4664]: I1003 08:22:58.090861 4664 generic.go:334] "Generic (PLEG): container finished" podID="9f440401-13c2-4c2f-aff8-41e856a920c5" containerID="afa04ff0aa192821d9da77b71c4db4c7edaeedfdb6b8ffce0c3ddea19ffb58e7" exitCode=0 Oct 03 08:22:58 crc kubenswrapper[4664]: I1003 08:22:58.090947 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" 
event={"ID":"9f440401-13c2-4c2f-aff8-41e856a920c5","Type":"ContainerDied","Data":"afa04ff0aa192821d9da77b71c4db4c7edaeedfdb6b8ffce0c3ddea19ffb58e7"} Oct 03 08:22:59 crc kubenswrapper[4664]: I1003 08:22:59.565315 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" Oct 03 08:22:59 crc kubenswrapper[4664]: I1003 08:22:59.742079 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f440401-13c2-4c2f-aff8-41e856a920c5-ssh-key\") pod \"9f440401-13c2-4c2f-aff8-41e856a920c5\" (UID: \"9f440401-13c2-4c2f-aff8-41e856a920c5\") " Oct 03 08:22:59 crc kubenswrapper[4664]: I1003 08:22:59.742237 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f440401-13c2-4c2f-aff8-41e856a920c5-inventory\") pod \"9f440401-13c2-4c2f-aff8-41e856a920c5\" (UID: \"9f440401-13c2-4c2f-aff8-41e856a920c5\") " Oct 03 08:22:59 crc kubenswrapper[4664]: I1003 08:22:59.742276 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slqhf\" (UniqueName: \"kubernetes.io/projected/9f440401-13c2-4c2f-aff8-41e856a920c5-kube-api-access-slqhf\") pod \"9f440401-13c2-4c2f-aff8-41e856a920c5\" (UID: \"9f440401-13c2-4c2f-aff8-41e856a920c5\") " Oct 03 08:22:59 crc kubenswrapper[4664]: I1003 08:22:59.757344 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f440401-13c2-4c2f-aff8-41e856a920c5-kube-api-access-slqhf" (OuterVolumeSpecName: "kube-api-access-slqhf") pod "9f440401-13c2-4c2f-aff8-41e856a920c5" (UID: "9f440401-13c2-4c2f-aff8-41e856a920c5"). InnerVolumeSpecName "kube-api-access-slqhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:22:59 crc kubenswrapper[4664]: I1003 08:22:59.774257 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f440401-13c2-4c2f-aff8-41e856a920c5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9f440401-13c2-4c2f-aff8-41e856a920c5" (UID: "9f440401-13c2-4c2f-aff8-41e856a920c5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:22:59 crc kubenswrapper[4664]: I1003 08:22:59.778766 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f440401-13c2-4c2f-aff8-41e856a920c5-inventory" (OuterVolumeSpecName: "inventory") pod "9f440401-13c2-4c2f-aff8-41e856a920c5" (UID: "9f440401-13c2-4c2f-aff8-41e856a920c5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:22:59 crc kubenswrapper[4664]: I1003 08:22:59.845348 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f440401-13c2-4c2f-aff8-41e856a920c5-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:59 crc kubenswrapper[4664]: I1003 08:22:59.845386 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slqhf\" (UniqueName: \"kubernetes.io/projected/9f440401-13c2-4c2f-aff8-41e856a920c5-kube-api-access-slqhf\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:59 crc kubenswrapper[4664]: I1003 08:22:59.845398 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f440401-13c2-4c2f-aff8-41e856a920c5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.110316 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" event={"ID":"9f440401-13c2-4c2f-aff8-41e856a920c5","Type":"ContainerDied","Data":"328e836aa9a92d39f7358af2b83f7e1ff08a6f9ccd3a1e03ef24d1d2c5d184d6"} Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.110362 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="328e836aa9a92d39f7358af2b83f7e1ff08a6f9ccd3a1e03ef24d1d2c5d184d6" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.110835 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zs964" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.193228 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php"] Oct 03 08:23:00 crc kubenswrapper[4664]: E1003 08:23:00.193791 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f440401-13c2-4c2f-aff8-41e856a920c5" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.193814 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f440401-13c2-4c2f-aff8-41e856a920c5" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.194056 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f440401-13c2-4c2f-aff8-41e856a920c5" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.194900 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.197433 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.197504 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.197760 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.199164 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.245372 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php"] Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.366955 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52361bdc-8d63-429f-b009-d28f61360dd8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f7php\" (UID: \"52361bdc-8d63-429f-b009-d28f61360dd8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.367459 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pvv7\" (UniqueName: \"kubernetes.io/projected/52361bdc-8d63-429f-b009-d28f61360dd8-kube-api-access-2pvv7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f7php\" (UID: \"52361bdc-8d63-429f-b009-d28f61360dd8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.367736 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52361bdc-8d63-429f-b009-d28f61360dd8-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f7php\" (UID: \"52361bdc-8d63-429f-b009-d28f61360dd8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.469659 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52361bdc-8d63-429f-b009-d28f61360dd8-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f7php\" (UID: \"52361bdc-8d63-429f-b009-d28f61360dd8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.469814 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52361bdc-8d63-429f-b009-d28f61360dd8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f7php\" (UID: \"52361bdc-8d63-429f-b009-d28f61360dd8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.469850 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pvv7\" (UniqueName: \"kubernetes.io/projected/52361bdc-8d63-429f-b009-d28f61360dd8-kube-api-access-2pvv7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f7php\" (UID: 
\"52361bdc-8d63-429f-b009-d28f61360dd8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.475110 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52361bdc-8d63-429f-b009-d28f61360dd8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f7php\" (UID: \"52361bdc-8d63-429f-b009-d28f61360dd8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.479132 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52361bdc-8d63-429f-b009-d28f61360dd8-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f7php\" (UID: \"52361bdc-8d63-429f-b009-d28f61360dd8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.489563 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pvv7\" (UniqueName: \"kubernetes.io/projected/52361bdc-8d63-429f-b009-d28f61360dd8-kube-api-access-2pvv7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f7php\" (UID: \"52361bdc-8d63-429f-b009-d28f61360dd8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" Oct 03 08:23:00 crc kubenswrapper[4664]: I1003 08:23:00.515491 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" Oct 03 08:23:01 crc kubenswrapper[4664]: I1003 08:23:01.054474 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php"] Oct 03 08:23:01 crc kubenswrapper[4664]: I1003 08:23:01.122071 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" event={"ID":"52361bdc-8d63-429f-b009-d28f61360dd8","Type":"ContainerStarted","Data":"d08dc4ae1055d5174f74934994459b42de1d34889ff47ccd3067decd4f1ac413"} Oct 03 08:23:02 crc kubenswrapper[4664]: I1003 08:23:02.138590 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" event={"ID":"52361bdc-8d63-429f-b009-d28f61360dd8","Type":"ContainerStarted","Data":"07f1d81cd5dc37d1979b6433ddae2e1b87a70f784b01ab5b733b7b6758b3895c"} Oct 03 08:23:02 crc kubenswrapper[4664]: I1003 08:23:02.156587 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" podStartSLOduration=2.010284357 podStartE2EDuration="2.156561329s" podCreationTimestamp="2025-10-03 08:23:00 +0000 UTC" firstStartedPulling="2025-10-03 08:23:01.061368471 +0000 UTC m=+2081.882558961" lastFinishedPulling="2025-10-03 08:23:01.207645443 +0000 UTC m=+2082.028835933" observedRunningTime="2025-10-03 08:23:02.155189841 +0000 UTC m=+2082.976380341" watchObservedRunningTime="2025-10-03 08:23:02.156561329 +0000 UTC m=+2082.977751819" Oct 03 08:23:11 crc kubenswrapper[4664]: I1003 08:23:11.235815 4664 generic.go:334] "Generic (PLEG): container finished" podID="52361bdc-8d63-429f-b009-d28f61360dd8" containerID="07f1d81cd5dc37d1979b6433ddae2e1b87a70f784b01ab5b733b7b6758b3895c" exitCode=0 Oct 03 08:23:11 crc kubenswrapper[4664]: I1003 08:23:11.235913 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" 
event={"ID":"52361bdc-8d63-429f-b009-d28f61360dd8","Type":"ContainerDied","Data":"07f1d81cd5dc37d1979b6433ddae2e1b87a70f784b01ab5b733b7b6758b3895c"} Oct 03 08:23:12 crc kubenswrapper[4664]: I1003 08:23:12.651518 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" Oct 03 08:23:12 crc kubenswrapper[4664]: I1003 08:23:12.729439 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52361bdc-8d63-429f-b009-d28f61360dd8-inventory\") pod \"52361bdc-8d63-429f-b009-d28f61360dd8\" (UID: \"52361bdc-8d63-429f-b009-d28f61360dd8\") " Oct 03 08:23:12 crc kubenswrapper[4664]: I1003 08:23:12.729843 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52361bdc-8d63-429f-b009-d28f61360dd8-ssh-key\") pod \"52361bdc-8d63-429f-b009-d28f61360dd8\" (UID: \"52361bdc-8d63-429f-b009-d28f61360dd8\") " Oct 03 08:23:12 crc kubenswrapper[4664]: I1003 08:23:12.729916 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pvv7\" (UniqueName: \"kubernetes.io/projected/52361bdc-8d63-429f-b009-d28f61360dd8-kube-api-access-2pvv7\") pod \"52361bdc-8d63-429f-b009-d28f61360dd8\" (UID: \"52361bdc-8d63-429f-b009-d28f61360dd8\") " Oct 03 08:23:12 crc kubenswrapper[4664]: I1003 08:23:12.736306 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52361bdc-8d63-429f-b009-d28f61360dd8-kube-api-access-2pvv7" (OuterVolumeSpecName: "kube-api-access-2pvv7") pod "52361bdc-8d63-429f-b009-d28f61360dd8" (UID: "52361bdc-8d63-429f-b009-d28f61360dd8"). InnerVolumeSpecName "kube-api-access-2pvv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:23:12 crc kubenswrapper[4664]: I1003 08:23:12.758727 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52361bdc-8d63-429f-b009-d28f61360dd8-inventory" (OuterVolumeSpecName: "inventory") pod "52361bdc-8d63-429f-b009-d28f61360dd8" (UID: "52361bdc-8d63-429f-b009-d28f61360dd8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:23:12 crc kubenswrapper[4664]: I1003 08:23:12.758842 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52361bdc-8d63-429f-b009-d28f61360dd8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "52361bdc-8d63-429f-b009-d28f61360dd8" (UID: "52361bdc-8d63-429f-b009-d28f61360dd8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:23:12 crc kubenswrapper[4664]: I1003 08:23:12.832202 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pvv7\" (UniqueName: \"kubernetes.io/projected/52361bdc-8d63-429f-b009-d28f61360dd8-kube-api-access-2pvv7\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:12 crc kubenswrapper[4664]: I1003 08:23:12.832235 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52361bdc-8d63-429f-b009-d28f61360dd8-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:12 crc kubenswrapper[4664]: I1003 08:23:12.832245 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52361bdc-8d63-429f-b009-d28f61360dd8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.282646 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.282824 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f7php" event={"ID":"52361bdc-8d63-429f-b009-d28f61360dd8","Type":"ContainerDied","Data":"d08dc4ae1055d5174f74934994459b42de1d34889ff47ccd3067decd4f1ac413"} Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.283903 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d08dc4ae1055d5174f74934994459b42de1d34889ff47ccd3067decd4f1ac413" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.359534 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k"] Oct 03 08:23:13 crc kubenswrapper[4664]: E1003 08:23:13.360209 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52361bdc-8d63-429f-b009-d28f61360dd8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.360237 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="52361bdc-8d63-429f-b009-d28f61360dd8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.360504 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="52361bdc-8d63-429f-b009-d28f61360dd8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.361481 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.364992 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.365159 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.365493 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.365573 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.365673 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.366529 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.366792 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.368321 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.375295 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k"] Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.444566 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.444686 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.444736 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.444771 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.444805 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.444830 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.444857 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kmkj\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-kube-api-access-5kmkj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.444898 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.444947 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.444981 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.445014 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: 
\"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.445050 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.445079 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.445156 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.546659 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.546737 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.546802 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.546851 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc 
kubenswrapper[4664]: I1003 08:23:13.546883 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.546916 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.546945 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kmkj\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-kube-api-access-5kmkj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.547000 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.547075 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.547115 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.547157 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.547208 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.547241 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.547359 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.553693 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.554385 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.554730 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.555072 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.555432 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.556910 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.557383 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.557504 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.558713 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.560398 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.561014 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.562959 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.565678 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.569080 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kmkj\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-kube-api-access-5kmkj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5l55k\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:13 crc kubenswrapper[4664]: I1003 08:23:13.690013 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:14 crc kubenswrapper[4664]: I1003 08:23:14.262197 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k"] Oct 03 08:23:14 crc kubenswrapper[4664]: I1003 08:23:14.294178 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" event={"ID":"9334fa32-3e3c-4a4c-ab00-71902c455beb","Type":"ContainerStarted","Data":"df5d65353f7bf16ab4fbf0d1906e1b993364406101d7298f9b57b4972bd18f6b"} Oct 03 08:23:15 crc kubenswrapper[4664]: I1003 08:23:15.305037 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" event={"ID":"9334fa32-3e3c-4a4c-ab00-71902c455beb","Type":"ContainerStarted","Data":"3bc5a18791fae9a8edb4e9f6a219adadfb4ce55c1bd429b25e4157fd89d87a9e"} Oct 03 08:23:15 crc kubenswrapper[4664]: I1003 08:23:15.340560 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" podStartSLOduration=2.136249647 podStartE2EDuration="2.340536839s" podCreationTimestamp="2025-10-03 08:23:13 +0000 UTC" firstStartedPulling="2025-10-03 08:23:14.264995002 +0000 UTC m=+2095.086185492" lastFinishedPulling="2025-10-03 08:23:14.469282194 +0000 UTC m=+2095.290472684" observedRunningTime="2025-10-03 08:23:15.331064598 +0000 UTC m=+2096.152255098" watchObservedRunningTime="2025-10-03 08:23:15.340536839 +0000 UTC m=+2096.161727329" Oct 03 08:23:25 crc kubenswrapper[4664]: I1003 08:23:25.142464 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w48ld"] Oct 03 08:23:25 crc kubenswrapper[4664]: I1003 08:23:25.146652 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:25 crc kubenswrapper[4664]: I1003 08:23:25.155441 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w48ld"] Oct 03 08:23:25 crc kubenswrapper[4664]: I1003 08:23:25.298522 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2zn5\" (UniqueName: \"kubernetes.io/projected/c72c84e0-19b3-456a-ad59-dc41f4584315-kube-api-access-s2zn5\") pod \"redhat-operators-w48ld\" (UID: \"c72c84e0-19b3-456a-ad59-dc41f4584315\") " pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:25 crc kubenswrapper[4664]: I1003 08:23:25.298596 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c72c84e0-19b3-456a-ad59-dc41f4584315-catalog-content\") pod \"redhat-operators-w48ld\" (UID: \"c72c84e0-19b3-456a-ad59-dc41f4584315\") " pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:25 crc kubenswrapper[4664]: I1003 08:23:25.298737 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c72c84e0-19b3-456a-ad59-dc41f4584315-utilities\") pod \"redhat-operators-w48ld\" (UID: \"c72c84e0-19b3-456a-ad59-dc41f4584315\") " pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:25 crc kubenswrapper[4664]: I1003 08:23:25.400642 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c72c84e0-19b3-456a-ad59-dc41f4584315-catalog-content\") pod \"redhat-operators-w48ld\" (UID: \"c72c84e0-19b3-456a-ad59-dc41f4584315\") " pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:25 crc kubenswrapper[4664]: I1003 08:23:25.400752 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c72c84e0-19b3-456a-ad59-dc41f4584315-utilities\") pod \"redhat-operators-w48ld\" (UID: \"c72c84e0-19b3-456a-ad59-dc41f4584315\") " pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:25 crc kubenswrapper[4664]: I1003 08:23:25.400898 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2zn5\" (UniqueName: \"kubernetes.io/projected/c72c84e0-19b3-456a-ad59-dc41f4584315-kube-api-access-s2zn5\") pod \"redhat-operators-w48ld\" (UID: \"c72c84e0-19b3-456a-ad59-dc41f4584315\") " pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:25 crc kubenswrapper[4664]: I1003 08:23:25.401306 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c72c84e0-19b3-456a-ad59-dc41f4584315-catalog-content\") pod \"redhat-operators-w48ld\" (UID: \"c72c84e0-19b3-456a-ad59-dc41f4584315\") " pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:25 crc kubenswrapper[4664]: I1003 08:23:25.401346 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c72c84e0-19b3-456a-ad59-dc41f4584315-utilities\") pod \"redhat-operators-w48ld\" (UID: \"c72c84e0-19b3-456a-ad59-dc41f4584315\") " pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:25 crc kubenswrapper[4664]: I1003 08:23:25.424312 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s2zn5\" (UniqueName: \"kubernetes.io/projected/c72c84e0-19b3-456a-ad59-dc41f4584315-kube-api-access-s2zn5\") pod \"redhat-operators-w48ld\" (UID: \"c72c84e0-19b3-456a-ad59-dc41f4584315\") " pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:25 crc kubenswrapper[4664]: I1003 08:23:25.497828 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:25 crc kubenswrapper[4664]: I1003 08:23:25.986476 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w48ld"] Oct 03 08:23:26 crc kubenswrapper[4664]: I1003 08:23:26.401079 4664 generic.go:334] "Generic (PLEG): container finished" podID="c72c84e0-19b3-456a-ad59-dc41f4584315" containerID="4881d9ce37a8efbf1a72c1977c18bf5ec6f7ac59eb5b47f5b19d9e9220aa5af0" exitCode=0 Oct 03 08:23:26 crc kubenswrapper[4664]: I1003 08:23:26.401248 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w48ld" event={"ID":"c72c84e0-19b3-456a-ad59-dc41f4584315","Type":"ContainerDied","Data":"4881d9ce37a8efbf1a72c1977c18bf5ec6f7ac59eb5b47f5b19d9e9220aa5af0"} Oct 03 08:23:26 crc kubenswrapper[4664]: I1003 08:23:26.401410 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w48ld" event={"ID":"c72c84e0-19b3-456a-ad59-dc41f4584315","Type":"ContainerStarted","Data":"7ef25ed5f5719fc69a3923ec901a5009331f071c5aeddb2411bc1e365a0d96c0"} Oct 03 08:23:27 crc kubenswrapper[4664]: I1003 08:23:27.412825 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w48ld" event={"ID":"c72c84e0-19b3-456a-ad59-dc41f4584315","Type":"ContainerStarted","Data":"33e9e6ba8a10deb8acca5cac40c3cd662f13d56cd3c1284c09fa82ae25a09557"} Oct 03 08:23:29 crc kubenswrapper[4664]: I1003 08:23:29.429349 4664 generic.go:334] "Generic (PLEG): container finished" podID="c72c84e0-19b3-456a-ad59-dc41f4584315" containerID="33e9e6ba8a10deb8acca5cac40c3cd662f13d56cd3c1284c09fa82ae25a09557" exitCode=0 Oct 03 08:23:29 crc kubenswrapper[4664]: I1003 08:23:29.429433 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w48ld" event={"ID":"c72c84e0-19b3-456a-ad59-dc41f4584315","Type":"ContainerDied","Data":"33e9e6ba8a10deb8acca5cac40c3cd662f13d56cd3c1284c09fa82ae25a09557"} Oct 03 08:23:30 crc kubenswrapper[4664]: I1003 08:23:30.454951 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w48ld" event={"ID":"c72c84e0-19b3-456a-ad59-dc41f4584315","Type":"ContainerStarted","Data":"6b243f006f909f8aa85f389ee2ad9ec1d04613a604ba3141e3ea5493ff4cb2d9"} Oct 03 08:23:30 crc kubenswrapper[4664]: I1003 08:23:30.481618 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w48ld" podStartSLOduration=2.024112254 podStartE2EDuration="5.481581737s" podCreationTimestamp="2025-10-03 08:23:25 +0000 UTC" firstStartedPulling="2025-10-03 08:23:26.402930801 +0000 UTC m=+2107.224121291" lastFinishedPulling="2025-10-03 08:23:29.860400284 +0000 UTC m=+2110.681590774" observedRunningTime="2025-10-03 08:23:30.478188974 +0000 UTC m=+2111.299379464" watchObservedRunningTime="2025-10-03 08:23:30.481581737 +0000 UTC m=+2111.302772227" Oct 03 08:23:35 crc kubenswrapper[4664]: I1003 08:23:35.498469 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 
08:23:35 crc kubenswrapper[4664]: I1003 08:23:35.499202 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:35 crc kubenswrapper[4664]: I1003 08:23:35.547278 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:36 crc kubenswrapper[4664]: I1003 08:23:36.561256 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:36 crc kubenswrapper[4664]: I1003 08:23:36.612437 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w48ld"] Oct 03 08:23:38 crc kubenswrapper[4664]: I1003 08:23:38.521472 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w48ld" podUID="c72c84e0-19b3-456a-ad59-dc41f4584315" containerName="registry-server" containerID="cri-o://6b243f006f909f8aa85f389ee2ad9ec1d04613a604ba3141e3ea5493ff4cb2d9" gracePeriod=2 Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.037712 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.102331 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2zn5\" (UniqueName: \"kubernetes.io/projected/c72c84e0-19b3-456a-ad59-dc41f4584315-kube-api-access-s2zn5\") pod \"c72c84e0-19b3-456a-ad59-dc41f4584315\" (UID: \"c72c84e0-19b3-456a-ad59-dc41f4584315\") " Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.102455 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c72c84e0-19b3-456a-ad59-dc41f4584315-utilities\") pod \"c72c84e0-19b3-456a-ad59-dc41f4584315\" (UID: \"c72c84e0-19b3-456a-ad59-dc41f4584315\") " Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.102533 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c72c84e0-19b3-456a-ad59-dc41f4584315-catalog-content\") pod \"c72c84e0-19b3-456a-ad59-dc41f4584315\" (UID: \"c72c84e0-19b3-456a-ad59-dc41f4584315\") " Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.104290 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c72c84e0-19b3-456a-ad59-dc41f4584315-utilities" (OuterVolumeSpecName: "utilities") pod "c72c84e0-19b3-456a-ad59-dc41f4584315" (UID: "c72c84e0-19b3-456a-ad59-dc41f4584315"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.110096 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72c84e0-19b3-456a-ad59-dc41f4584315-kube-api-access-s2zn5" (OuterVolumeSpecName: "kube-api-access-s2zn5") pod "c72c84e0-19b3-456a-ad59-dc41f4584315" (UID: "c72c84e0-19b3-456a-ad59-dc41f4584315"). InnerVolumeSpecName "kube-api-access-s2zn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.189661 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c72c84e0-19b3-456a-ad59-dc41f4584315-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c72c84e0-19b3-456a-ad59-dc41f4584315" (UID: "c72c84e0-19b3-456a-ad59-dc41f4584315"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.204774 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c72c84e0-19b3-456a-ad59-dc41f4584315-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.205106 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c72c84e0-19b3-456a-ad59-dc41f4584315-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.205177 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2zn5\" (UniqueName: \"kubernetes.io/projected/c72c84e0-19b3-456a-ad59-dc41f4584315-kube-api-access-s2zn5\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.531909 4664 generic.go:334] "Generic (PLEG): container finished" podID="c72c84e0-19b3-456a-ad59-dc41f4584315" containerID="6b243f006f909f8aa85f389ee2ad9ec1d04613a604ba3141e3ea5493ff4cb2d9" exitCode=0 Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.531992 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w48ld" event={"ID":"c72c84e0-19b3-456a-ad59-dc41f4584315","Type":"ContainerDied","Data":"6b243f006f909f8aa85f389ee2ad9ec1d04613a604ba3141e3ea5493ff4cb2d9"} Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.532484 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w48ld" event={"ID":"c72c84e0-19b3-456a-ad59-dc41f4584315","Type":"ContainerDied","Data":"7ef25ed5f5719fc69a3923ec901a5009331f071c5aeddb2411bc1e365a0d96c0"} Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.532532 4664 scope.go:117] "RemoveContainer" containerID="6b243f006f909f8aa85f389ee2ad9ec1d04613a604ba3141e3ea5493ff4cb2d9" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.532019 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w48ld" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.564326 4664 scope.go:117] "RemoveContainer" containerID="33e9e6ba8a10deb8acca5cac40c3cd662f13d56cd3c1284c09fa82ae25a09557" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.577992 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w48ld"] Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.586034 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w48ld"] Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.590176 4664 scope.go:117] "RemoveContainer" containerID="4881d9ce37a8efbf1a72c1977c18bf5ec6f7ac59eb5b47f5b19d9e9220aa5af0" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.650793 4664 scope.go:117] "RemoveContainer" containerID="6b243f006f909f8aa85f389ee2ad9ec1d04613a604ba3141e3ea5493ff4cb2d9" Oct 03 08:23:39 crc kubenswrapper[4664]: E1003 08:23:39.651417 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b243f006f909f8aa85f389ee2ad9ec1d04613a604ba3141e3ea5493ff4cb2d9\": container with ID starting with 6b243f006f909f8aa85f389ee2ad9ec1d04613a604ba3141e3ea5493ff4cb2d9 not found: ID does not exist" containerID="6b243f006f909f8aa85f389ee2ad9ec1d04613a604ba3141e3ea5493ff4cb2d9" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.651520 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b243f006f909f8aa85f389ee2ad9ec1d04613a604ba3141e3ea5493ff4cb2d9"} err="failed to get container status \"6b243f006f909f8aa85f389ee2ad9ec1d04613a604ba3141e3ea5493ff4cb2d9\": rpc error: code = NotFound desc = could not find container \"6b243f006f909f8aa85f389ee2ad9ec1d04613a604ba3141e3ea5493ff4cb2d9\": container with ID starting with 6b243f006f909f8aa85f389ee2ad9ec1d04613a604ba3141e3ea5493ff4cb2d9 not found: ID does not exist" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.651565 4664 scope.go:117] "RemoveContainer" containerID="33e9e6ba8a10deb8acca5cac40c3cd662f13d56cd3c1284c09fa82ae25a09557" Oct 03 08:23:39 crc kubenswrapper[4664]: E1003 08:23:39.652339 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33e9e6ba8a10deb8acca5cac40c3cd662f13d56cd3c1284c09fa82ae25a09557\": container with ID starting with 33e9e6ba8a10deb8acca5cac40c3cd662f13d56cd3c1284c09fa82ae25a09557 not found: ID does not exist" containerID="33e9e6ba8a10deb8acca5cac40c3cd662f13d56cd3c1284c09fa82ae25a09557" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.652400 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e9e6ba8a10deb8acca5cac40c3cd662f13d56cd3c1284c09fa82ae25a09557"} err="failed to get container status \"33e9e6ba8a10deb8acca5cac40c3cd662f13d56cd3c1284c09fa82ae25a09557\": rpc error: code = NotFound desc = could not find container \"33e9e6ba8a10deb8acca5cac40c3cd662f13d56cd3c1284c09fa82ae25a09557\": container with ID starting with 33e9e6ba8a10deb8acca5cac40c3cd662f13d56cd3c1284c09fa82ae25a09557 not found: ID does not exist" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.652440 4664 scope.go:117] "RemoveContainer" containerID="4881d9ce37a8efbf1a72c1977c18bf5ec6f7ac59eb5b47f5b19d9e9220aa5af0" Oct 03 08:23:39 crc kubenswrapper[4664]: E1003 08:23:39.652881 4664 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"4881d9ce37a8efbf1a72c1977c18bf5ec6f7ac59eb5b47f5b19d9e9220aa5af0\": container with ID starting with 4881d9ce37a8efbf1a72c1977c18bf5ec6f7ac59eb5b47f5b19d9e9220aa5af0 not found: ID does not exist" containerID="4881d9ce37a8efbf1a72c1977c18bf5ec6f7ac59eb5b47f5b19d9e9220aa5af0" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.652941 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4881d9ce37a8efbf1a72c1977c18bf5ec6f7ac59eb5b47f5b19d9e9220aa5af0"} err="failed to get container status \"4881d9ce37a8efbf1a72c1977c18bf5ec6f7ac59eb5b47f5b19d9e9220aa5af0\": rpc error: code = NotFound desc = could not find container \"4881d9ce37a8efbf1a72c1977c18bf5ec6f7ac59eb5b47f5b19d9e9220aa5af0\": container with ID starting with 4881d9ce37a8efbf1a72c1977c18bf5ec6f7ac59eb5b47f5b19d9e9220aa5af0 not found: ID does not exist" Oct 03 08:23:39 crc kubenswrapper[4664]: I1003 08:23:39.887834 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c72c84e0-19b3-456a-ad59-dc41f4584315" path="/var/lib/kubelet/pods/c72c84e0-19b3-456a-ad59-dc41f4584315/volumes" Oct 03 08:23:53 crc kubenswrapper[4664]: I1003 08:23:53.685564 4664 generic.go:334] "Generic (PLEG): container finished" podID="9334fa32-3e3c-4a4c-ab00-71902c455beb" containerID="3bc5a18791fae9a8edb4e9f6a219adadfb4ce55c1bd429b25e4157fd89d87a9e" exitCode=0 Oct 03 08:23:53 crc kubenswrapper[4664]: I1003 08:23:53.685761 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" event={"ID":"9334fa32-3e3c-4a4c-ab00-71902c455beb","Type":"ContainerDied","Data":"3bc5a18791fae9a8edb4e9f6a219adadfb4ce55c1bd429b25e4157fd89d87a9e"} Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.156421 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.239288 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"9334fa32-3e3c-4a4c-ab00-71902c455beb\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.239357 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-bootstrap-combined-ca-bundle\") pod \"9334fa32-3e3c-4a4c-ab00-71902c455beb\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.239458 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-nova-combined-ca-bundle\") pod \"9334fa32-3e3c-4a4c-ab00-71902c455beb\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.239504 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-inventory\") pod \"9334fa32-3e3c-4a4c-ab00-71902c455beb\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.239529 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-libvirt-combined-ca-bundle\") pod \"9334fa32-3e3c-4a4c-ab00-71902c455beb\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.239562 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"9334fa32-3e3c-4a4c-ab00-71902c455beb\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.239646 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"9334fa32-3e3c-4a4c-ab00-71902c455beb\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.239691 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-neutron-metadata-combined-ca-bundle\") pod \"9334fa32-3e3c-4a4c-ab00-71902c455beb\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.239742 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kmkj\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-kube-api-access-5kmkj\") pod \"9334fa32-3e3c-4a4c-ab00-71902c455beb\" (UID: 
\"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.239799 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-telemetry-combined-ca-bundle\") pod \"9334fa32-3e3c-4a4c-ab00-71902c455beb\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.239848 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-ovn-combined-ca-bundle\") pod \"9334fa32-3e3c-4a4c-ab00-71902c455beb\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.239895 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"9334fa32-3e3c-4a4c-ab00-71902c455beb\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.240467 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-repo-setup-combined-ca-bundle\") pod \"9334fa32-3e3c-4a4c-ab00-71902c455beb\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.240512 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-ssh-key\") pod \"9334fa32-3e3c-4a4c-ab00-71902c455beb\" (UID: \"9334fa32-3e3c-4a4c-ab00-71902c455beb\") " Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.248656 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9334fa32-3e3c-4a4c-ab00-71902c455beb" (UID: "9334fa32-3e3c-4a4c-ab00-71902c455beb"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.249079 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9334fa32-3e3c-4a4c-ab00-71902c455beb" (UID: "9334fa32-3e3c-4a4c-ab00-71902c455beb"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.249198 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "9334fa32-3e3c-4a4c-ab00-71902c455beb" (UID: "9334fa32-3e3c-4a4c-ab00-71902c455beb"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.250026 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9334fa32-3e3c-4a4c-ab00-71902c455beb" (UID: "9334fa32-3e3c-4a4c-ab00-71902c455beb"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.250152 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9334fa32-3e3c-4a4c-ab00-71902c455beb" (UID: "9334fa32-3e3c-4a4c-ab00-71902c455beb"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.250055 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9334fa32-3e3c-4a4c-ab00-71902c455beb" (UID: "9334fa32-3e3c-4a4c-ab00-71902c455beb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.251185 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-kube-api-access-5kmkj" (OuterVolumeSpecName: "kube-api-access-5kmkj") pod "9334fa32-3e3c-4a4c-ab00-71902c455beb" (UID: "9334fa32-3e3c-4a4c-ab00-71902c455beb"). InnerVolumeSpecName "kube-api-access-5kmkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.251725 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9334fa32-3e3c-4a4c-ab00-71902c455beb" (UID: "9334fa32-3e3c-4a4c-ab00-71902c455beb"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.252022 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "9334fa32-3e3c-4a4c-ab00-71902c455beb" (UID: "9334fa32-3e3c-4a4c-ab00-71902c455beb"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.252429 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "9334fa32-3e3c-4a4c-ab00-71902c455beb" (UID: "9334fa32-3e3c-4a4c-ab00-71902c455beb"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.254617 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9334fa32-3e3c-4a4c-ab00-71902c455beb" (UID: "9334fa32-3e3c-4a4c-ab00-71902c455beb"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.255451 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "9334fa32-3e3c-4a4c-ab00-71902c455beb" (UID: "9334fa32-3e3c-4a4c-ab00-71902c455beb"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.279757 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9334fa32-3e3c-4a4c-ab00-71902c455beb" (UID: "9334fa32-3e3c-4a4c-ab00-71902c455beb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.284011 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-inventory" (OuterVolumeSpecName: "inventory") pod "9334fa32-3e3c-4a4c-ab00-71902c455beb" (UID: "9334fa32-3e3c-4a4c-ab00-71902c455beb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.343619 4664 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.343932 4664 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.343960 4664 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.343973 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.343984 4664 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.343996 4664 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.344010 4664 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.344027 4664 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.344038 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kmkj\" (UniqueName: \"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-kube-api-access-5kmkj\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.344051 4664 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.344062 4664 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.344073 4664 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/9334fa32-3e3c-4a4c-ab00-71902c455beb-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.344085 4664 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.344099 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9334fa32-3e3c-4a4c-ab00-71902c455beb-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.713695 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.713734 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5l55k" event={"ID":"9334fa32-3e3c-4a4c-ab00-71902c455beb","Type":"ContainerDied","Data":"df5d65353f7bf16ab4fbf0d1906e1b993364406101d7298f9b57b4972bd18f6b"} Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.713778 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df5d65353f7bf16ab4fbf0d1906e1b993364406101d7298f9b57b4972bd18f6b" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.817613 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns"] Oct 03 08:23:55 crc kubenswrapper[4664]: E1003 08:23:55.818143 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72c84e0-19b3-456a-ad59-dc41f4584315" containerName="registry-server" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.818164 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72c84e0-19b3-456a-ad59-dc41f4584315" containerName="registry-server" Oct 03 08:23:55 crc kubenswrapper[4664]: E1003 08:23:55.818177 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72c84e0-19b3-456a-ad59-dc41f4584315" containerName="extract-content" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.818184 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72c84e0-19b3-456a-ad59-dc41f4584315" containerName="extract-content" Oct 03 08:23:55 crc kubenswrapper[4664]: E1003 08:23:55.818207 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9334fa32-3e3c-4a4c-ab00-71902c455beb" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.818219 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9334fa32-3e3c-4a4c-ab00-71902c455beb" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 03 08:23:55 crc kubenswrapper[4664]: E1003 08:23:55.818251 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72c84e0-19b3-456a-ad59-dc41f4584315" containerName="extract-utilities" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.818258 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72c84e0-19b3-456a-ad59-dc41f4584315" containerName="extract-utilities" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.818491 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72c84e0-19b3-456a-ad59-dc41f4584315" containerName="registry-server" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 
08:23:55.818519 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="9334fa32-3e3c-4a4c-ab00-71902c455beb" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.821890 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.833600 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.833992 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.834364 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.834555 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.834719 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.853074 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns"] Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.958990 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c317ebff-b55c-4484-97a8-d90142316326-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v46ns\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.959170 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v46ns\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.959209 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v46ns\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.959247 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v46ns\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:55 crc kubenswrapper[4664]: I1003 08:23:55.959288 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcfl6\" (UniqueName: \"kubernetes.io/projected/c317ebff-b55c-4484-97a8-d90142316326-kube-api-access-tcfl6\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-v46ns\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:56 crc kubenswrapper[4664]: I1003 08:23:56.060853 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v46ns\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:56 crc kubenswrapper[4664]: I1003 08:23:56.060922 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcfl6\" (UniqueName: \"kubernetes.io/projected/c317ebff-b55c-4484-97a8-d90142316326-kube-api-access-tcfl6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v46ns\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:56 crc kubenswrapper[4664]: I1003 08:23:56.060981 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c317ebff-b55c-4484-97a8-d90142316326-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v46ns\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:56 crc kubenswrapper[4664]: I1003 08:23:56.061075 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v46ns\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:56 crc kubenswrapper[4664]: I1003 08:23:56.061116 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v46ns\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:56 crc kubenswrapper[4664]: I1003 08:23:56.063131 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c317ebff-b55c-4484-97a8-d90142316326-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v46ns\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:56 crc kubenswrapper[4664]: I1003 08:23:56.066899 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v46ns\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:56 crc kubenswrapper[4664]: I1003 08:23:56.066973 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v46ns\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:56 crc 
kubenswrapper[4664]: I1003 08:23:56.067582 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v46ns\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:56 crc kubenswrapper[4664]: I1003 08:23:56.084241 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcfl6\" (UniqueName: \"kubernetes.io/projected/c317ebff-b55c-4484-97a8-d90142316326-kube-api-access-tcfl6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v46ns\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:56 crc kubenswrapper[4664]: I1003 08:23:56.165256 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:23:56 crc kubenswrapper[4664]: I1003 08:23:56.643270 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns"] Oct 03 08:23:56 crc kubenswrapper[4664]: I1003 08:23:56.727377 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" event={"ID":"c317ebff-b55c-4484-97a8-d90142316326","Type":"ContainerStarted","Data":"27dbf5796e08debca0e890ecec25cb6a3c8746fbe88e5118d27e861bf8c339eb"} Oct 03 08:23:57 crc kubenswrapper[4664]: I1003 08:23:57.738285 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" event={"ID":"c317ebff-b55c-4484-97a8-d90142316326","Type":"ContainerStarted","Data":"63733f76bb164046e37e0ce511a4fd33f713e9239d8ecb6de6653a37787c29c0"} Oct 03 08:23:57 crc kubenswrapper[4664]: I1003 08:23:57.760789 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" podStartSLOduration=2.4468125130000002 podStartE2EDuration="2.760761117s" podCreationTimestamp="2025-10-03 08:23:55 +0000 UTC" firstStartedPulling="2025-10-03 08:23:56.654340659 +0000 UTC m=+2137.475531149" lastFinishedPulling="2025-10-03 08:23:56.968289263 +0000 UTC m=+2137.789479753" observedRunningTime="2025-10-03 08:23:57.759806411 +0000 UTC m=+2138.580996911" watchObservedRunningTime="2025-10-03 08:23:57.760761117 +0000 UTC m=+2138.581951607" Oct 03 08:24:11 crc kubenswrapper[4664]: I1003 08:24:11.987433 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:24:11 crc kubenswrapper[4664]: I1003 08:24:11.988365 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:24:26 crc kubenswrapper[4664]: I1003 08:24:26.982869 4664 generic.go:334] "Generic (PLEG): container finished" podID="c317ebff-b55c-4484-97a8-d90142316326" containerID="63733f76bb164046e37e0ce511a4fd33f713e9239d8ecb6de6653a37787c29c0" exitCode=2 Oct 03 08:24:26 
crc kubenswrapper[4664]: I1003 08:24:26.982954 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" event={"ID":"c317ebff-b55c-4484-97a8-d90142316326","Type":"ContainerDied","Data":"63733f76bb164046e37e0ce511a4fd33f713e9239d8ecb6de6653a37787c29c0"} Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.375693 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.445862 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-inventory\") pod \"c317ebff-b55c-4484-97a8-d90142316326\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.446041 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcfl6\" (UniqueName: \"kubernetes.io/projected/c317ebff-b55c-4484-97a8-d90142316326-kube-api-access-tcfl6\") pod \"c317ebff-b55c-4484-97a8-d90142316326\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.446079 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-ssh-key\") pod \"c317ebff-b55c-4484-97a8-d90142316326\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.446164 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c317ebff-b55c-4484-97a8-d90142316326-ovncontroller-config-0\") pod \"c317ebff-b55c-4484-97a8-d90142316326\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.446251 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-ovn-combined-ca-bundle\") pod \"c317ebff-b55c-4484-97a8-d90142316326\" (UID: \"c317ebff-b55c-4484-97a8-d90142316326\") " Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.451952 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c317ebff-b55c-4484-97a8-d90142316326-kube-api-access-tcfl6" (OuterVolumeSpecName: "kube-api-access-tcfl6") pod "c317ebff-b55c-4484-97a8-d90142316326" (UID: "c317ebff-b55c-4484-97a8-d90142316326"). InnerVolumeSpecName "kube-api-access-tcfl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.457920 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c317ebff-b55c-4484-97a8-d90142316326" (UID: "c317ebff-b55c-4484-97a8-d90142316326"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.475782 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c317ebff-b55c-4484-97a8-d90142316326-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "c317ebff-b55c-4484-97a8-d90142316326" (UID: "c317ebff-b55c-4484-97a8-d90142316326"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.475951 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-inventory" (OuterVolumeSpecName: "inventory") pod "c317ebff-b55c-4484-97a8-d90142316326" (UID: "c317ebff-b55c-4484-97a8-d90142316326"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.477332 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c317ebff-b55c-4484-97a8-d90142316326" (UID: "c317ebff-b55c-4484-97a8-d90142316326"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.549227 4664 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.549275 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.549290 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcfl6\" (UniqueName: \"kubernetes.io/projected/c317ebff-b55c-4484-97a8-d90142316326-kube-api-access-tcfl6\") on node \"crc\" DevicePath \"\"" Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.549301 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c317ebff-b55c-4484-97a8-d90142316326-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:24:28 crc kubenswrapper[4664]: I1003 08:24:28.549312 4664 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c317ebff-b55c-4484-97a8-d90142316326-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:24:29 crc kubenswrapper[4664]: I1003 08:24:29.001808 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" event={"ID":"c317ebff-b55c-4484-97a8-d90142316326","Type":"ContainerDied","Data":"27dbf5796e08debca0e890ecec25cb6a3c8746fbe88e5118d27e861bf8c339eb"} Oct 03 08:24:29 crc kubenswrapper[4664]: I1003 08:24:29.001863 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27dbf5796e08debca0e890ecec25cb6a3c8746fbe88e5118d27e861bf8c339eb" Oct 03 08:24:29 crc kubenswrapper[4664]: I1003 08:24:29.001900 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v46ns" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.033012 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn"] Oct 03 08:24:36 crc kubenswrapper[4664]: E1003 08:24:36.033812 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c317ebff-b55c-4484-97a8-d90142316326" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.033826 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="c317ebff-b55c-4484-97a8-d90142316326" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.034061 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="c317ebff-b55c-4484-97a8-d90142316326" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.034908 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.038254 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.038557 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.038681 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.039397 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.039581 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.048761 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn"] Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.091834 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d8gg\" (UniqueName: \"kubernetes.io/projected/2e962f71-22f7-48d9-af7d-53baea22b9cc-kube-api-access-6d8gg\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5fkkn\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.091927 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5fkkn\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.091979 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5fkkn\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 
08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.092048 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5fkkn\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.092129 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2e962f71-22f7-48d9-af7d-53baea22b9cc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5fkkn\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.194246 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2e962f71-22f7-48d9-af7d-53baea22b9cc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5fkkn\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.194386 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d8gg\" (UniqueName: \"kubernetes.io/projected/2e962f71-22f7-48d9-af7d-53baea22b9cc-kube-api-access-6d8gg\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5fkkn\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.194432 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5fkkn\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.194458 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5fkkn\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.194499 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5fkkn\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.196173 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2e962f71-22f7-48d9-af7d-53baea22b9cc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5fkkn\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.205472 4664 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5fkkn\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.205515 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5fkkn\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.205892 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5fkkn\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.211225 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d8gg\" (UniqueName: \"kubernetes.io/projected/2e962f71-22f7-48d9-af7d-53baea22b9cc-kube-api-access-6d8gg\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5fkkn\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.358080 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:24:36 crc kubenswrapper[4664]: I1003 08:24:36.937508 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn"] Oct 03 08:24:37 crc kubenswrapper[4664]: I1003 08:24:37.081181 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" event={"ID":"2e962f71-22f7-48d9-af7d-53baea22b9cc","Type":"ContainerStarted","Data":"3b00a6e791c34d1e63dcaf95607998db8bef37fcd4998909a528b5c647680fe3"} Oct 03 08:24:38 crc kubenswrapper[4664]: I1003 08:24:38.092032 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" event={"ID":"2e962f71-22f7-48d9-af7d-53baea22b9cc","Type":"ContainerStarted","Data":"3b0a5186d803ef3f0d583a1ccdb214fe3e2ffdc8f0bde0e724db00977cdfce53"} Oct 03 08:24:38 crc kubenswrapper[4664]: I1003 08:24:38.114761 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" podStartSLOduration=1.91819103 podStartE2EDuration="2.114735222s" podCreationTimestamp="2025-10-03 08:24:36 +0000 UTC" firstStartedPulling="2025-10-03 08:24:36.949870157 +0000 UTC m=+2177.771060647" lastFinishedPulling="2025-10-03 08:24:37.146414349 +0000 UTC m=+2177.967604839" observedRunningTime="2025-10-03 08:24:38.108639458 +0000 UTC m=+2178.929829948" watchObservedRunningTime="2025-10-03 08:24:38.114735222 +0000 UTC m=+2178.935925712" Oct 03 08:24:41 crc kubenswrapper[4664]: I1003 08:24:41.987038 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Oct 03 08:24:41 crc kubenswrapper[4664]: I1003 08:24:41.987487 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:25:05 crc kubenswrapper[4664]: I1003 08:25:05.344059 4664 generic.go:334] "Generic (PLEG): container finished" podID="2e962f71-22f7-48d9-af7d-53baea22b9cc" containerID="3b0a5186d803ef3f0d583a1ccdb214fe3e2ffdc8f0bde0e724db00977cdfce53" exitCode=2 Oct 03 08:25:05 crc kubenswrapper[4664]: I1003 08:25:05.344136 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" event={"ID":"2e962f71-22f7-48d9-af7d-53baea22b9cc","Type":"ContainerDied","Data":"3b0a5186d803ef3f0d583a1ccdb214fe3e2ffdc8f0bde0e724db00977cdfce53"} Oct 03 08:25:06 crc kubenswrapper[4664]: I1003 08:25:06.810126 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:25:06 crc kubenswrapper[4664]: I1003 08:25:06.896258 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2e962f71-22f7-48d9-af7d-53baea22b9cc-ovncontroller-config-0\") pod \"2e962f71-22f7-48d9-af7d-53baea22b9cc\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " Oct 03 08:25:06 crc kubenswrapper[4664]: I1003 08:25:06.896430 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-ovn-combined-ca-bundle\") pod \"2e962f71-22f7-48d9-af7d-53baea22b9cc\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " Oct 03 08:25:06 crc kubenswrapper[4664]: I1003 08:25:06.896487 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-ssh-key\") pod \"2e962f71-22f7-48d9-af7d-53baea22b9cc\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " Oct 03 08:25:06 crc kubenswrapper[4664]: I1003 08:25:06.897271 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d8gg\" (UniqueName: \"kubernetes.io/projected/2e962f71-22f7-48d9-af7d-53baea22b9cc-kube-api-access-6d8gg\") pod \"2e962f71-22f7-48d9-af7d-53baea22b9cc\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " Oct 03 08:25:06 crc kubenswrapper[4664]: I1003 08:25:06.897319 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-inventory\") pod \"2e962f71-22f7-48d9-af7d-53baea22b9cc\" (UID: \"2e962f71-22f7-48d9-af7d-53baea22b9cc\") " Oct 03 08:25:06 crc kubenswrapper[4664]: I1003 08:25:06.903171 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "2e962f71-22f7-48d9-af7d-53baea22b9cc" (UID: "2e962f71-22f7-48d9-af7d-53baea22b9cc"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:25:06 crc kubenswrapper[4664]: I1003 08:25:06.903338 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e962f71-22f7-48d9-af7d-53baea22b9cc-kube-api-access-6d8gg" (OuterVolumeSpecName: "kube-api-access-6d8gg") pod "2e962f71-22f7-48d9-af7d-53baea22b9cc" (UID: "2e962f71-22f7-48d9-af7d-53baea22b9cc"). InnerVolumeSpecName "kube-api-access-6d8gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:25:06 crc kubenswrapper[4664]: I1003 08:25:06.925540 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e962f71-22f7-48d9-af7d-53baea22b9cc-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "2e962f71-22f7-48d9-af7d-53baea22b9cc" (UID: "2e962f71-22f7-48d9-af7d-53baea22b9cc"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:25:06 crc kubenswrapper[4664]: I1003 08:25:06.927403 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-inventory" (OuterVolumeSpecName: "inventory") pod "2e962f71-22f7-48d9-af7d-53baea22b9cc" (UID: "2e962f71-22f7-48d9-af7d-53baea22b9cc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:25:06 crc kubenswrapper[4664]: I1003 08:25:06.929889 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2e962f71-22f7-48d9-af7d-53baea22b9cc" (UID: "2e962f71-22f7-48d9-af7d-53baea22b9cc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:25:07 crc kubenswrapper[4664]: I1003 08:25:07.000265 4664 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:25:07 crc kubenswrapper[4664]: I1003 08:25:07.000309 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:25:07 crc kubenswrapper[4664]: I1003 08:25:07.000320 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d8gg\" (UniqueName: \"kubernetes.io/projected/2e962f71-22f7-48d9-af7d-53baea22b9cc-kube-api-access-6d8gg\") on node \"crc\" DevicePath \"\"" Oct 03 08:25:07 crc kubenswrapper[4664]: I1003 08:25:07.000328 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e962f71-22f7-48d9-af7d-53baea22b9cc-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:25:07 crc kubenswrapper[4664]: I1003 08:25:07.000337 4664 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2e962f71-22f7-48d9-af7d-53baea22b9cc-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:25:07 crc kubenswrapper[4664]: I1003 08:25:07.364384 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" event={"ID":"2e962f71-22f7-48d9-af7d-53baea22b9cc","Type":"ContainerDied","Data":"3b00a6e791c34d1e63dcaf95607998db8bef37fcd4998909a528b5c647680fe3"} Oct 03 08:25:07 crc kubenswrapper[4664]: I1003 
08:25:07.364733 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b00a6e791c34d1e63dcaf95607998db8bef37fcd4998909a528b5c647680fe3" Oct 03 08:25:07 crc kubenswrapper[4664]: I1003 08:25:07.364784 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5fkkn" Oct 03 08:25:11 crc kubenswrapper[4664]: I1003 08:25:11.987523 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:25:11 crc kubenswrapper[4664]: I1003 08:25:11.988280 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:25:11 crc kubenswrapper[4664]: I1003 08:25:11.988337 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 08:25:11 crc kubenswrapper[4664]: I1003 08:25:11.989187 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be55d517673418e2a00ffa031e5df1b78c2e2063781c6c50b4ccffd65918f5b0"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:25:11 crc kubenswrapper[4664]: I1003 08:25:11.989245 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://be55d517673418e2a00ffa031e5df1b78c2e2063781c6c50b4ccffd65918f5b0" gracePeriod=600 Oct 03 08:25:12 crc kubenswrapper[4664]: I1003 08:25:12.411069 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="be55d517673418e2a00ffa031e5df1b78c2e2063781c6c50b4ccffd65918f5b0" exitCode=0 Oct 03 08:25:12 crc kubenswrapper[4664]: I1003 08:25:12.411185 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"be55d517673418e2a00ffa031e5df1b78c2e2063781c6c50b4ccffd65918f5b0"} Oct 03 08:25:12 crc kubenswrapper[4664]: I1003 08:25:12.411620 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9"} Oct 03 08:25:12 crc kubenswrapper[4664]: I1003 08:25:12.411656 4664 scope.go:117] "RemoveContainer" containerID="bfd5903d53b3dc9e39f07ec7a1586cebd8a3cc45fb7b6842a928b48f35ec7cf9" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.029528 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn"] Oct 03 08:25:24 crc kubenswrapper[4664]: E1003 08:25:24.030568 4664 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2e962f71-22f7-48d9-af7d-53baea22b9cc" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.030585 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e962f71-22f7-48d9-af7d-53baea22b9cc" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.030839 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e962f71-22f7-48d9-af7d-53baea22b9cc" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.031551 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.033463 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.039155 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn"] Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.039455 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.039483 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.039825 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.052976 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.211512 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdkfh\" (UniqueName: \"kubernetes.io/projected/a36410cf-2458-4589-86e0-e921e7489d07-kube-api-access-wdkfh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgwdn\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.211640 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgwdn\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.211677 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgwdn\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.212052 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgwdn\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.212187 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a36410cf-2458-4589-86e0-e921e7489d07-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgwdn\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.313850 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgwdn\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.313930 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a36410cf-2458-4589-86e0-e921e7489d07-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgwdn\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.314024 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdkfh\" (UniqueName: \"kubernetes.io/projected/a36410cf-2458-4589-86e0-e921e7489d07-kube-api-access-wdkfh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgwdn\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.314091 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgwdn\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.314130 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgwdn\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.315106 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a36410cf-2458-4589-86e0-e921e7489d07-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgwdn\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.321709 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgwdn\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.322165 4664 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgwdn\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.323327 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgwdn\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.334061 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdkfh\" (UniqueName: \"kubernetes.io/projected/a36410cf-2458-4589-86e0-e921e7489d07-kube-api-access-wdkfh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgwdn\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.365023 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" Oct 03 08:25:24 crc kubenswrapper[4664]: I1003 08:25:24.919531 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn"] Oct 03 08:25:25 crc kubenswrapper[4664]: I1003 08:25:25.542141 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" event={"ID":"a36410cf-2458-4589-86e0-e921e7489d07","Type":"ContainerStarted","Data":"13df8aca272471842bd99f9e23288c2f51617facb8b24e25596c5e26fa49b5dc"} Oct 03 08:25:25 crc kubenswrapper[4664]: I1003 08:25:25.543682 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" event={"ID":"a36410cf-2458-4589-86e0-e921e7489d07","Type":"ContainerStarted","Data":"170c95a2e11097ff5e782b2dabfaac4a858161278ba990e9da5c34722ace906b"} Oct 03 08:25:25 crc kubenswrapper[4664]: I1003 08:25:25.564266 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" podStartSLOduration=1.362905615 podStartE2EDuration="1.564239823s" podCreationTimestamp="2025-10-03 08:25:24 +0000 UTC" firstStartedPulling="2025-10-03 08:25:24.929099378 +0000 UTC m=+2225.750289868" lastFinishedPulling="2025-10-03 08:25:25.130433596 +0000 UTC m=+2225.951624076" observedRunningTime="2025-10-03 08:25:25.557310315 +0000 UTC m=+2226.378500805" watchObservedRunningTime="2025-10-03 08:25:25.564239823 +0000 UTC m=+2226.385430303" Oct 03 08:25:53 crc kubenswrapper[4664]: I1003 08:25:53.771689 4664 generic.go:334] "Generic (PLEG): container finished" podID="a36410cf-2458-4589-86e0-e921e7489d07" containerID="13df8aca272471842bd99f9e23288c2f51617facb8b24e25596c5e26fa49b5dc" exitCode=2 Oct 03 08:25:53 crc kubenswrapper[4664]: I1003 08:25:53.771770 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" event={"ID":"a36410cf-2458-4589-86e0-e921e7489d07","Type":"ContainerDied","Data":"13df8aca272471842bd99f9e23288c2f51617facb8b24e25596c5e26fa49b5dc"} Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.232706 4664 util.go:48] "No ready 
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.248469 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-ovn-combined-ca-bundle\") pod \"a36410cf-2458-4589-86e0-e921e7489d07\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") "
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.248586 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a36410cf-2458-4589-86e0-e921e7489d07-ovncontroller-config-0\") pod \"a36410cf-2458-4589-86e0-e921e7489d07\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") "
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.248689 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-inventory\") pod \"a36410cf-2458-4589-86e0-e921e7489d07\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") "
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.248778 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-ssh-key\") pod \"a36410cf-2458-4589-86e0-e921e7489d07\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") "
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.248817 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdkfh\" (UniqueName: \"kubernetes.io/projected/a36410cf-2458-4589-86e0-e921e7489d07-kube-api-access-wdkfh\") pod \"a36410cf-2458-4589-86e0-e921e7489d07\" (UID: \"a36410cf-2458-4589-86e0-e921e7489d07\") "
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.257272 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a36410cf-2458-4589-86e0-e921e7489d07-kube-api-access-wdkfh" (OuterVolumeSpecName: "kube-api-access-wdkfh") pod "a36410cf-2458-4589-86e0-e921e7489d07" (UID: "a36410cf-2458-4589-86e0-e921e7489d07"). InnerVolumeSpecName "kube-api-access-wdkfh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.257426 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a36410cf-2458-4589-86e0-e921e7489d07" (UID: "a36410cf-2458-4589-86e0-e921e7489d07"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.282482 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-inventory" (OuterVolumeSpecName: "inventory") pod "a36410cf-2458-4589-86e0-e921e7489d07" (UID: "a36410cf-2458-4589-86e0-e921e7489d07"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.289677 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a36410cf-2458-4589-86e0-e921e7489d07-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a36410cf-2458-4589-86e0-e921e7489d07" (UID: "a36410cf-2458-4589-86e0-e921e7489d07"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.295983 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a36410cf-2458-4589-86e0-e921e7489d07" (UID: "a36410cf-2458-4589-86e0-e921e7489d07"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.351208 4664 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.351286 4664 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a36410cf-2458-4589-86e0-e921e7489d07-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.351315 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-inventory\") on node \"crc\" DevicePath \"\""
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.351339 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a36410cf-2458-4589-86e0-e921e7489d07-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.351361 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdkfh\" (UniqueName: \"kubernetes.io/projected/a36410cf-2458-4589-86e0-e921e7489d07-kube-api-access-wdkfh\") on node \"crc\" DevicePath \"\""
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.791200 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn" event={"ID":"a36410cf-2458-4589-86e0-e921e7489d07","Type":"ContainerDied","Data":"170c95a2e11097ff5e782b2dabfaac4a858161278ba990e9da5c34722ace906b"}
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.791270 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="170c95a2e11097ff5e782b2dabfaac4a858161278ba990e9da5c34722ace906b"
Oct 03 08:25:55 crc kubenswrapper[4664]: I1003 08:25:55.791289 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgwdn"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.029699 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"]
Oct 03 08:26:33 crc kubenswrapper[4664]: E1003 08:26:33.030648 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36410cf-2458-4589-86e0-e921e7489d07" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.030662 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36410cf-2458-4589-86e0-e921e7489d07" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.030892 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="a36410cf-2458-4589-86e0-e921e7489d07" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.031670 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.033876 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.034128 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.034262 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.034410 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.038949 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.060192 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"]
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.134518 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54rtt\" (UniqueName: \"kubernetes.io/projected/50ab7fd6-d934-496e-96fd-debf67b6b634-kube-api-access-54rtt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sk7ch\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.134720 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sk7ch\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.134758 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sk7ch\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.134776 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sk7ch\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.134804 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/50ab7fd6-d934-496e-96fd-debf67b6b634-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sk7ch\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.236740 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sk7ch\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.236806 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sk7ch\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.236839 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sk7ch\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.236895 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/50ab7fd6-d934-496e-96fd-debf67b6b634-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sk7ch\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.236957 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54rtt\" (UniqueName: \"kubernetes.io/projected/50ab7fd6-d934-496e-96fd-debf67b6b634-kube-api-access-54rtt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sk7ch\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.238183 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/50ab7fd6-d934-496e-96fd-debf67b6b634-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sk7ch\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.251307 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sk7ch\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.251491 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sk7ch\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.251826 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sk7ch\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.255378 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54rtt\" (UniqueName: \"kubernetes.io/projected/50ab7fd6-d934-496e-96fd-debf67b6b634-kube-api-access-54rtt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sk7ch\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.352509 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:26:33 crc kubenswrapper[4664]: I1003 08:26:33.916754 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"]
Oct 03 08:26:34 crc kubenswrapper[4664]: I1003 08:26:34.113150 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch" event={"ID":"50ab7fd6-d934-496e-96fd-debf67b6b634","Type":"ContainerStarted","Data":"d65ce25d1a9fa7bff57d5b3e062bb40837a794ce90b0165b5c4ce105ed009d4d"}
Oct 03 08:26:35 crc kubenswrapper[4664]: I1003 08:26:35.122402 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch" event={"ID":"50ab7fd6-d934-496e-96fd-debf67b6b634","Type":"ContainerStarted","Data":"08f6e93e517a4ad4567c3b1f9e741f204ee34b8505af858b5cc25bfe194de1e8"}
Oct 03 08:26:35 crc kubenswrapper[4664]: I1003 08:26:35.143167 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch" podStartSLOduration=1.9577410419999999 podStartE2EDuration="2.143145876s" podCreationTimestamp="2025-10-03 08:26:33 +0000 UTC" firstStartedPulling="2025-10-03 08:26:33.923468624 +0000 UTC m=+2294.744659114" lastFinishedPulling="2025-10-03 08:26:34.108873458 +0000 UTC m=+2294.930063948" observedRunningTime="2025-10-03 08:26:35.141395626 +0000 UTC m=+2295.962586156" watchObservedRunningTime="2025-10-03 08:26:35.143145876 +0000 UTC m=+2295.964336366"
Oct 03 08:27:02 crc kubenswrapper[4664]: I1003 08:27:02.372449 4664 generic.go:334] "Generic (PLEG): container finished" podID="50ab7fd6-d934-496e-96fd-debf67b6b634" containerID="08f6e93e517a4ad4567c3b1f9e741f204ee34b8505af858b5cc25bfe194de1e8" exitCode=2
Oct 03 08:27:02 crc kubenswrapper[4664]: I1003 08:27:02.373040 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch" event={"ID":"50ab7fd6-d934-496e-96fd-debf67b6b634","Type":"ContainerDied","Data":"08f6e93e517a4ad4567c3b1f9e741f204ee34b8505af858b5cc25bfe194de1e8"}
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.784995 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.893296 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-inventory\") pod \"50ab7fd6-d934-496e-96fd-debf67b6b634\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") "
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.893372 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54rtt\" (UniqueName: \"kubernetes.io/projected/50ab7fd6-d934-496e-96fd-debf67b6b634-kube-api-access-54rtt\") pod \"50ab7fd6-d934-496e-96fd-debf67b6b634\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") "
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.893399 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/50ab7fd6-d934-496e-96fd-debf67b6b634-ovncontroller-config-0\") pod \"50ab7fd6-d934-496e-96fd-debf67b6b634\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") "
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.893426 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-ssh-key\") pod \"50ab7fd6-d934-496e-96fd-debf67b6b634\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") "
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.893498 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-ovn-combined-ca-bundle\") pod \"50ab7fd6-d934-496e-96fd-debf67b6b634\" (UID: \"50ab7fd6-d934-496e-96fd-debf67b6b634\") "
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.900305 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50ab7fd6-d934-496e-96fd-debf67b6b634-kube-api-access-54rtt" (OuterVolumeSpecName: "kube-api-access-54rtt") pod "50ab7fd6-d934-496e-96fd-debf67b6b634" (UID: "50ab7fd6-d934-496e-96fd-debf67b6b634"). InnerVolumeSpecName "kube-api-access-54rtt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.902820 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "50ab7fd6-d934-496e-96fd-debf67b6b634" (UID: "50ab7fd6-d934-496e-96fd-debf67b6b634"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.921093 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50ab7fd6-d934-496e-96fd-debf67b6b634-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "50ab7fd6-d934-496e-96fd-debf67b6b634" (UID: "50ab7fd6-d934-496e-96fd-debf67b6b634"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.924407 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "50ab7fd6-d934-496e-96fd-debf67b6b634" (UID: "50ab7fd6-d934-496e-96fd-debf67b6b634"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.924981 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-inventory" (OuterVolumeSpecName: "inventory") pod "50ab7fd6-d934-496e-96fd-debf67b6b634" (UID: "50ab7fd6-d934-496e-96fd-debf67b6b634"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.996512 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-inventory\") on node \"crc\" DevicePath \"\""
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.996549 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54rtt\" (UniqueName: \"kubernetes.io/projected/50ab7fd6-d934-496e-96fd-debf67b6b634-kube-api-access-54rtt\") on node \"crc\" DevicePath \"\""
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.996560 4664 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/50ab7fd6-d934-496e-96fd-debf67b6b634-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.996569 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 03 08:27:03 crc kubenswrapper[4664]: I1003 08:27:03.996577 4664 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50ab7fd6-d934-496e-96fd-debf67b6b634-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 08:27:04 crc kubenswrapper[4664]: I1003 08:27:04.394176 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch" event={"ID":"50ab7fd6-d934-496e-96fd-debf67b6b634","Type":"ContainerDied","Data":"d65ce25d1a9fa7bff57d5b3e062bb40837a794ce90b0165b5c4ce105ed009d4d"}
Oct 03 08:27:04 crc kubenswrapper[4664]: I1003 08:27:04.394237 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d65ce25d1a9fa7bff57d5b3e062bb40837a794ce90b0165b5c4ce105ed009d4d"
Oct 03 08:27:04 crc kubenswrapper[4664]: I1003 08:27:04.394273 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sk7ch"
Oct 03 08:27:41 crc kubenswrapper[4664]: I1003 08:27:41.988176 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 08:27:41 crc kubenswrapper[4664]: I1003 08:27:41.988735 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 08:28:11 crc kubenswrapper[4664]: I1003 08:28:11.987213 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 08:28:11 crc kubenswrapper[4664]: I1003 08:28:11.988160 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.034012 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl"]
Oct 03 08:28:22 crc kubenswrapper[4664]: E1003 08:28:22.036735 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ab7fd6-d934-496e-96fd-debf67b6b634" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.036842 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ab7fd6-d934-496e-96fd-debf67b6b634" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.037203 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ab7fd6-d934-496e-96fd-debf67b6b634" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.038105 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl"
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.045334 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.045805 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.046794 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.047568 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.048173 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.050186 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl"] Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.122706 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-l9xcl\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.122779 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-l9xcl\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.122835 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-l9xcl\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.122867 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r46tt\" (UniqueName: \"kubernetes.io/projected/41264278-270d-4f29-b68a-15340641bcb4-kube-api-access-r46tt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-l9xcl\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.122928 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/41264278-270d-4f29-b68a-15340641bcb4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-l9xcl\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.225540 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" 
(UniqueName: \"kubernetes.io/configmap/41264278-270d-4f29-b68a-15340641bcb4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-l9xcl\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.225742 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-l9xcl\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.225789 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-l9xcl\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.227029 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/41264278-270d-4f29-b68a-15340641bcb4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-l9xcl\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.227224 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-l9xcl\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.227568 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r46tt\" (UniqueName: \"kubernetes.io/projected/41264278-270d-4f29-b68a-15340641bcb4-kube-api-access-r46tt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-l9xcl\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.233937 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-l9xcl\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.236864 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-l9xcl\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.237500 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-l9xcl\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.247735 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r46tt\" (UniqueName: \"kubernetes.io/projected/41264278-270d-4f29-b68a-15340641bcb4-kube-api-access-r46tt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-l9xcl\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.407290 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.990283 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl"] Oct 03 08:28:22 crc kubenswrapper[4664]: I1003 08:28:22.999669 4664 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:28:23 crc kubenswrapper[4664]: I1003 08:28:23.119100 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" event={"ID":"41264278-270d-4f29-b68a-15340641bcb4","Type":"ContainerStarted","Data":"a5d23d22ebce0320d85203dfaca6498f568e1338ab30b8612244b13153905404"} Oct 03 08:28:24 crc kubenswrapper[4664]: I1003 08:28:24.140771 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" event={"ID":"41264278-270d-4f29-b68a-15340641bcb4","Type":"ContainerStarted","Data":"5860e4b49f7e56793ccc0746934e01308f90fe86df1d1dc82e2ca29c024a9ef6"} Oct 03 08:28:24 crc kubenswrapper[4664]: I1003 08:28:24.178741 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" podStartSLOduration=2.002598237 podStartE2EDuration="2.178710564s" podCreationTimestamp="2025-10-03 08:28:22 +0000 UTC" firstStartedPulling="2025-10-03 08:28:22.999424099 +0000 UTC m=+2403.820614589" lastFinishedPulling="2025-10-03 08:28:23.175536426 +0000 UTC m=+2403.996726916" observedRunningTime="2025-10-03 08:28:24.167305837 +0000 UTC m=+2404.988496327" watchObservedRunningTime="2025-10-03 08:28:24.178710564 +0000 UTC m=+2404.999901064" Oct 03 08:28:41 crc kubenswrapper[4664]: I1003 08:28:41.987681 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:28:41 crc kubenswrapper[4664]: I1003 08:28:41.988386 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:28:41 crc kubenswrapper[4664]: I1003 08:28:41.988453 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 08:28:41 crc kubenswrapper[4664]: I1003 08:28:41.989396 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:28:41 crc kubenswrapper[4664]: I1003 08:28:41.989463 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" gracePeriod=600 Oct 03 08:28:42 crc kubenswrapper[4664]: E1003 08:28:42.110663 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:28:42 crc kubenswrapper[4664]: I1003 08:28:42.339869 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" exitCode=0 Oct 03 08:28:42 crc kubenswrapper[4664]: I1003 08:28:42.339990 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9"} Oct 03 08:28:42 crc kubenswrapper[4664]: I1003 08:28:42.340434 4664 scope.go:117] "RemoveContainer" containerID="be55d517673418e2a00ffa031e5df1b78c2e2063781c6c50b4ccffd65918f5b0" Oct 03 08:28:42 crc kubenswrapper[4664]: I1003 08:28:42.341116 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:28:42 crc kubenswrapper[4664]: E1003 08:28:42.341375 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:28:52 crc kubenswrapper[4664]: I1003 08:28:52.440157 4664 generic.go:334] "Generic (PLEG): container finished" podID="41264278-270d-4f29-b68a-15340641bcb4" containerID="5860e4b49f7e56793ccc0746934e01308f90fe86df1d1dc82e2ca29c024a9ef6" exitCode=2 Oct 03 08:28:52 crc kubenswrapper[4664]: I1003 08:28:52.440256 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" event={"ID":"41264278-270d-4f29-b68a-15340641bcb4","Type":"ContainerDied","Data":"5860e4b49f7e56793ccc0746934e01308f90fe86df1d1dc82e2ca29c024a9ef6"} Oct 03 08:28:53 crc kubenswrapper[4664]: I1003 08:28:53.914234 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.023541 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r46tt\" (UniqueName: \"kubernetes.io/projected/41264278-270d-4f29-b68a-15340641bcb4-kube-api-access-r46tt\") pod \"41264278-270d-4f29-b68a-15340641bcb4\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.023723 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-ovn-combined-ca-bundle\") pod \"41264278-270d-4f29-b68a-15340641bcb4\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.023765 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/41264278-270d-4f29-b68a-15340641bcb4-ovncontroller-config-0\") pod \"41264278-270d-4f29-b68a-15340641bcb4\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.023829 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-ssh-key\") pod \"41264278-270d-4f29-b68a-15340641bcb4\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.023932 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-inventory\") pod \"41264278-270d-4f29-b68a-15340641bcb4\" (UID: \"41264278-270d-4f29-b68a-15340641bcb4\") " Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.030256 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "41264278-270d-4f29-b68a-15340641bcb4" (UID: "41264278-270d-4f29-b68a-15340641bcb4"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.031460 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41264278-270d-4f29-b68a-15340641bcb4-kube-api-access-r46tt" (OuterVolumeSpecName: "kube-api-access-r46tt") pod "41264278-270d-4f29-b68a-15340641bcb4" (UID: "41264278-270d-4f29-b68a-15340641bcb4"). InnerVolumeSpecName "kube-api-access-r46tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.053216 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-inventory" (OuterVolumeSpecName: "inventory") pod "41264278-270d-4f29-b68a-15340641bcb4" (UID: "41264278-270d-4f29-b68a-15340641bcb4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.054872 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "41264278-270d-4f29-b68a-15340641bcb4" (UID: "41264278-270d-4f29-b68a-15340641bcb4"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.055501 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41264278-270d-4f29-b68a-15340641bcb4-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "41264278-270d-4f29-b68a-15340641bcb4" (UID: "41264278-270d-4f29-b68a-15340641bcb4"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.126787 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r46tt\" (UniqueName: \"kubernetes.io/projected/41264278-270d-4f29-b68a-15340641bcb4-kube-api-access-r46tt\") on node \"crc\" DevicePath \"\"" Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.126840 4664 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.126851 4664 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/41264278-270d-4f29-b68a-15340641bcb4-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.126862 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.126874 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41264278-270d-4f29-b68a-15340641bcb4-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.467585 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" event={"ID":"41264278-270d-4f29-b68a-15340641bcb4","Type":"ContainerDied","Data":"a5d23d22ebce0320d85203dfaca6498f568e1338ab30b8612244b13153905404"} Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.467669 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5d23d22ebce0320d85203dfaca6498f568e1338ab30b8612244b13153905404" Oct 03 08:28:54 crc kubenswrapper[4664]: I1003 08:28:54.468025 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-l9xcl" Oct 03 08:28:55 crc kubenswrapper[4664]: I1003 08:28:55.884838 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:28:55 crc kubenswrapper[4664]: E1003 08:28:55.886430 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:29:07 crc kubenswrapper[4664]: I1003 08:29:07.876554 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:29:07 crc kubenswrapper[4664]: E1003 08:29:07.877278 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:29:18 crc kubenswrapper[4664]: I1003 08:29:18.877877 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:29:18 crc kubenswrapper[4664]: E1003 08:29:18.879026 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:29:32 crc kubenswrapper[4664]: I1003 08:29:32.876422 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:29:32 crc kubenswrapper[4664]: E1003 08:29:32.878012 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:29:44 crc kubenswrapper[4664]: I1003 08:29:44.876457 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:29:44 crc kubenswrapper[4664]: E1003 08:29:44.877577 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:29:56 crc kubenswrapper[4664]: I1003 08:29:56.876844 4664 scope.go:117] "RemoveContainer" 
containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:29:56 crc kubenswrapper[4664]: E1003 08:29:56.878636 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.169385 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv"] Oct 03 08:30:00 crc kubenswrapper[4664]: E1003 08:30:00.170566 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41264278-270d-4f29-b68a-15340641bcb4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.170589 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="41264278-270d-4f29-b68a-15340641bcb4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.170929 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="41264278-270d-4f29-b68a-15340641bcb4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.172104 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.176866 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.177099 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.184514 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv"] Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.318219 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx6l5\" (UniqueName: \"kubernetes.io/projected/8ad251f2-3025-4616-a70f-b280170c1443-kube-api-access-vx6l5\") pod \"collect-profiles-29324670-cwfcv\" (UID: \"8ad251f2-3025-4616-a70f-b280170c1443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.318275 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ad251f2-3025-4616-a70f-b280170c1443-secret-volume\") pod \"collect-profiles-29324670-cwfcv\" (UID: \"8ad251f2-3025-4616-a70f-b280170c1443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.318399 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ad251f2-3025-4616-a70f-b280170c1443-config-volume\") pod \"collect-profiles-29324670-cwfcv\" (UID: \"8ad251f2-3025-4616-a70f-b280170c1443\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.420869 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ad251f2-3025-4616-a70f-b280170c1443-config-volume\") pod \"collect-profiles-29324670-cwfcv\" (UID: \"8ad251f2-3025-4616-a70f-b280170c1443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.421048 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx6l5\" (UniqueName: \"kubernetes.io/projected/8ad251f2-3025-4616-a70f-b280170c1443-kube-api-access-vx6l5\") pod \"collect-profiles-29324670-cwfcv\" (UID: \"8ad251f2-3025-4616-a70f-b280170c1443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.421080 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ad251f2-3025-4616-a70f-b280170c1443-secret-volume\") pod \"collect-profiles-29324670-cwfcv\" (UID: \"8ad251f2-3025-4616-a70f-b280170c1443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.422354 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ad251f2-3025-4616-a70f-b280170c1443-config-volume\") pod \"collect-profiles-29324670-cwfcv\" (UID: \"8ad251f2-3025-4616-a70f-b280170c1443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.431450 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ad251f2-3025-4616-a70f-b280170c1443-secret-volume\") pod \"collect-profiles-29324670-cwfcv\" (UID: \"8ad251f2-3025-4616-a70f-b280170c1443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.442259 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx6l5\" (UniqueName: \"kubernetes.io/projected/8ad251f2-3025-4616-a70f-b280170c1443-kube-api-access-vx6l5\") pod \"collect-profiles-29324670-cwfcv\" (UID: \"8ad251f2-3025-4616-a70f-b280170c1443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" Oct 03 08:30:00 crc kubenswrapper[4664]: I1003 08:30:00.506124 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" Oct 03 08:30:01 crc kubenswrapper[4664]: I1003 08:30:01.051503 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv"] Oct 03 08:30:01 crc kubenswrapper[4664]: I1003 08:30:01.152418 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" event={"ID":"8ad251f2-3025-4616-a70f-b280170c1443","Type":"ContainerStarted","Data":"590652d21089c92532cbb6291d90b7918e779852a4e785ba5f3a6b71e3a476bb"} Oct 03 08:30:02 crc kubenswrapper[4664]: I1003 08:30:02.178649 4664 generic.go:334] "Generic (PLEG): container finished" podID="8ad251f2-3025-4616-a70f-b280170c1443" containerID="fb219621c5cd41755eb4f9aa1afe1895258112550fec6d11895f09a6ef88c605" exitCode=0 Oct 03 08:30:02 crc kubenswrapper[4664]: I1003 08:30:02.179451 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" event={"ID":"8ad251f2-3025-4616-a70f-b280170c1443","Type":"ContainerDied","Data":"fb219621c5cd41755eb4f9aa1afe1895258112550fec6d11895f09a6ef88c605"} Oct 03 08:30:03 crc kubenswrapper[4664]: I1003 08:30:03.531425 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" Oct 03 08:30:03 crc kubenswrapper[4664]: I1003 08:30:03.699356 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ad251f2-3025-4616-a70f-b280170c1443-secret-volume\") pod \"8ad251f2-3025-4616-a70f-b280170c1443\" (UID: \"8ad251f2-3025-4616-a70f-b280170c1443\") " Oct 03 08:30:03 crc kubenswrapper[4664]: I1003 08:30:03.699438 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ad251f2-3025-4616-a70f-b280170c1443-config-volume\") pod \"8ad251f2-3025-4616-a70f-b280170c1443\" (UID: \"8ad251f2-3025-4616-a70f-b280170c1443\") " Oct 03 08:30:03 crc kubenswrapper[4664]: I1003 08:30:03.699702 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx6l5\" (UniqueName: \"kubernetes.io/projected/8ad251f2-3025-4616-a70f-b280170c1443-kube-api-access-vx6l5\") pod \"8ad251f2-3025-4616-a70f-b280170c1443\" (UID: \"8ad251f2-3025-4616-a70f-b280170c1443\") " Oct 03 08:30:03 crc kubenswrapper[4664]: I1003 08:30:03.700034 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ad251f2-3025-4616-a70f-b280170c1443-config-volume" (OuterVolumeSpecName: "config-volume") pod "8ad251f2-3025-4616-a70f-b280170c1443" (UID: "8ad251f2-3025-4616-a70f-b280170c1443"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:30:03 crc kubenswrapper[4664]: I1003 08:30:03.700989 4664 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ad251f2-3025-4616-a70f-b280170c1443-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:30:03 crc kubenswrapper[4664]: I1003 08:30:03.710893 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad251f2-3025-4616-a70f-b280170c1443-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8ad251f2-3025-4616-a70f-b280170c1443" (UID: "8ad251f2-3025-4616-a70f-b280170c1443"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:30:03 crc kubenswrapper[4664]: I1003 08:30:03.711098 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ad251f2-3025-4616-a70f-b280170c1443-kube-api-access-vx6l5" (OuterVolumeSpecName: "kube-api-access-vx6l5") pod "8ad251f2-3025-4616-a70f-b280170c1443" (UID: "8ad251f2-3025-4616-a70f-b280170c1443"). InnerVolumeSpecName "kube-api-access-vx6l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:30:03 crc kubenswrapper[4664]: I1003 08:30:03.803265 4664 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ad251f2-3025-4616-a70f-b280170c1443-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:30:03 crc kubenswrapper[4664]: I1003 08:30:03.803307 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx6l5\" (UniqueName: \"kubernetes.io/projected/8ad251f2-3025-4616-a70f-b280170c1443-kube-api-access-vx6l5\") on node \"crc\" DevicePath \"\"" Oct 03 08:30:04 crc kubenswrapper[4664]: I1003 08:30:04.198669 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" event={"ID":"8ad251f2-3025-4616-a70f-b280170c1443","Type":"ContainerDied","Data":"590652d21089c92532cbb6291d90b7918e779852a4e785ba5f3a6b71e3a476bb"} Oct 03 08:30:04 crc kubenswrapper[4664]: I1003 08:30:04.198715 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="590652d21089c92532cbb6291d90b7918e779852a4e785ba5f3a6b71e3a476bb" Oct 03 08:30:04 crc kubenswrapper[4664]: I1003 08:30:04.198780 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv" Oct 03 08:30:04 crc kubenswrapper[4664]: I1003 08:30:04.623502 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m"] Oct 03 08:30:04 crc kubenswrapper[4664]: I1003 08:30:04.633944 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324625-nj78m"] Oct 03 08:30:05 crc kubenswrapper[4664]: I1003 08:30:05.889272 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91" path="/var/lib/kubelet/pods/5c3aff15-c0d3-4d20-8ee3-7d753d5f7c91/volumes" Oct 03 08:30:10 crc kubenswrapper[4664]: I1003 08:30:10.877108 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:30:10 crc kubenswrapper[4664]: E1003 08:30:10.878100 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:30:21 crc kubenswrapper[4664]: I1003 08:30:21.876855 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:30:21 crc kubenswrapper[4664]: E1003 08:30:21.877803 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:30:26 crc kubenswrapper[4664]: I1003 08:30:26.354653 4664 scope.go:117] "RemoveContainer" containerID="45f77438245ec43407abee2ee75046f6eb1389e179ca39cbe00dce89645fb880" Oct 03 08:30:35 crc kubenswrapper[4664]: I1003 08:30:35.876778 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:30:35 crc kubenswrapper[4664]: E1003 08:30:35.877751 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:30:46 crc kubenswrapper[4664]: I1003 08:30:46.876033 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:30:46 crc kubenswrapper[4664]: E1003 08:30:46.876905 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:30:59 crc kubenswrapper[4664]: I1003 08:30:59.889203 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:30:59 crc kubenswrapper[4664]: E1003 08:30:59.896710 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:31:14 crc kubenswrapper[4664]: I1003 08:31:14.877214 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:31:14 crc kubenswrapper[4664]: E1003 08:31:14.879116 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:31:29 crc kubenswrapper[4664]: I1003 08:31:29.882159 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:31:29 crc kubenswrapper[4664]: E1003 08:31:29.883084 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.032762 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6"] Oct 03 08:31:31 crc kubenswrapper[4664]: E1003 08:31:31.033556 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad251f2-3025-4616-a70f-b280170c1443" containerName="collect-profiles" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.033571 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad251f2-3025-4616-a70f-b280170c1443" containerName="collect-profiles" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.033836 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad251f2-3025-4616-a70f-b280170c1443" containerName="collect-profiles" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.035590 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.039272 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.039489 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.039488 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.044509 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.046832 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.060518 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6"] Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.226556 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbsf6\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.226708 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd955\" (UniqueName: \"kubernetes.io/projected/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-kube-api-access-gd955\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbsf6\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.226786 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbsf6\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.226912 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbsf6\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.227036 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbsf6\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.330120 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbsf6\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.330307 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbsf6\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.330351 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbsf6\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.330396 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd955\" (UniqueName: \"kubernetes.io/projected/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-kube-api-access-gd955\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbsf6\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.330470 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbsf6\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.331336 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbsf6\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.338490 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbsf6\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.339296 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbsf6\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.339351 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbsf6\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.350730 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd955\" (UniqueName: \"kubernetes.io/projected/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-kube-api-access-gd955\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbsf6\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.360649 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:31:31 crc kubenswrapper[4664]: I1003 08:31:31.950007 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6"] Oct 03 08:31:32 crc kubenswrapper[4664]: I1003 08:31:32.012109 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" event={"ID":"da2b4748-a1ed-4edd-ba5e-2bdc917790dd","Type":"ContainerStarted","Data":"9b34a6532d64e7e57ad7a2af9a4163766b330bda9a6ee84342d0305db0aeb882"} Oct 03 08:31:33 crc kubenswrapper[4664]: I1003 08:31:33.025404 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" event={"ID":"da2b4748-a1ed-4edd-ba5e-2bdc917790dd","Type":"ContainerStarted","Data":"289902fcc6091e6a1337602bbdba839f0b646b3f862abc71f3855d8827e3c133"} Oct 03 08:31:33 crc kubenswrapper[4664]: I1003 08:31:33.061813 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" podStartSLOduration=1.819049913 podStartE2EDuration="2.061789009s" podCreationTimestamp="2025-10-03 08:31:31 +0000 UTC" firstStartedPulling="2025-10-03 08:31:31.951221514 +0000 UTC m=+2592.772412004" lastFinishedPulling="2025-10-03 08:31:32.19396061 +0000 UTC m=+2593.015151100" observedRunningTime="2025-10-03 08:31:33.048759275 +0000 UTC m=+2593.869949785" watchObservedRunningTime="2025-10-03 08:31:33.061789009 +0000 UTC m=+2593.882979499" Oct 03 08:31:43 crc kubenswrapper[4664]: I1003 08:31:43.876402 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:31:43 crc kubenswrapper[4664]: E1003 08:31:43.879242 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:31:56 crc kubenswrapper[4664]: I1003 08:31:56.876634 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:31:56 crc kubenswrapper[4664]: E1003 08:31:56.877703 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:32:00 crc 
kubenswrapper[4664]: I1003 08:32:00.282305 4664 generic.go:334] "Generic (PLEG): container finished" podID="da2b4748-a1ed-4edd-ba5e-2bdc917790dd" containerID="289902fcc6091e6a1337602bbdba839f0b646b3f862abc71f3855d8827e3c133" exitCode=2 Oct 03 08:32:00 crc kubenswrapper[4664]: I1003 08:32:00.282390 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" event={"ID":"da2b4748-a1ed-4edd-ba5e-2bdc917790dd","Type":"ContainerDied","Data":"289902fcc6091e6a1337602bbdba839f0b646b3f862abc71f3855d8827e3c133"} Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.695447 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.777556 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ovncontroller-config-0\") pod \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.777821 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd955\" (UniqueName: \"kubernetes.io/projected/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-kube-api-access-gd955\") pod \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.777917 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ssh-key\") pod \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.778043 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-inventory\") pod \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.778099 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ovn-combined-ca-bundle\") pod \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\" (UID: \"da2b4748-a1ed-4edd-ba5e-2bdc917790dd\") " Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.784248 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-kube-api-access-gd955" (OuterVolumeSpecName: "kube-api-access-gd955") pod "da2b4748-a1ed-4edd-ba5e-2bdc917790dd" (UID: "da2b4748-a1ed-4edd-ba5e-2bdc917790dd"). InnerVolumeSpecName "kube-api-access-gd955". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.784845 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "da2b4748-a1ed-4edd-ba5e-2bdc917790dd" (UID: "da2b4748-a1ed-4edd-ba5e-2bdc917790dd"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.806000 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "da2b4748-a1ed-4edd-ba5e-2bdc917790dd" (UID: "da2b4748-a1ed-4edd-ba5e-2bdc917790dd"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.807833 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-inventory" (OuterVolumeSpecName: "inventory") pod "da2b4748-a1ed-4edd-ba5e-2bdc917790dd" (UID: "da2b4748-a1ed-4edd-ba5e-2bdc917790dd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.811848 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "da2b4748-a1ed-4edd-ba5e-2bdc917790dd" (UID: "da2b4748-a1ed-4edd-ba5e-2bdc917790dd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.880339 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.880378 4664 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.880391 4664 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.880400 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd955\" (UniqueName: \"kubernetes.io/projected/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-kube-api-access-gd955\") on node \"crc\" DevicePath \"\"" Oct 03 08:32:01 crc kubenswrapper[4664]: I1003 08:32:01.880409 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da2b4748-a1ed-4edd-ba5e-2bdc917790dd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:32:02 crc kubenswrapper[4664]: I1003 08:32:02.303643 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" event={"ID":"da2b4748-a1ed-4edd-ba5e-2bdc917790dd","Type":"ContainerDied","Data":"9b34a6532d64e7e57ad7a2af9a4163766b330bda9a6ee84342d0305db0aeb882"} Oct 03 08:32:02 crc kubenswrapper[4664]: I1003 08:32:02.303689 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbsf6" Oct 03 08:32:02 crc kubenswrapper[4664]: I1003 08:32:02.303696 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b34a6532d64e7e57ad7a2af9a4163766b330bda9a6ee84342d0305db0aeb882" Oct 03 08:32:09 crc kubenswrapper[4664]: I1003 08:32:09.884938 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:32:09 crc kubenswrapper[4664]: E1003 08:32:09.885723 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:32:23 crc kubenswrapper[4664]: I1003 08:32:23.876743 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:32:23 crc kubenswrapper[4664]: E1003 08:32:23.881878 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:32:34 crc kubenswrapper[4664]: I1003 08:32:34.876017 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:32:34 crc kubenswrapper[4664]: E1003 08:32:34.876803 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.382341 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zzghw"] Oct 03 08:32:44 crc kubenswrapper[4664]: E1003 08:32:44.383649 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2b4748-a1ed-4edd-ba5e-2bdc917790dd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.383672 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2b4748-a1ed-4edd-ba5e-2bdc917790dd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.383896 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2b4748-a1ed-4edd-ba5e-2bdc917790dd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.385662 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.394269 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zzghw"] Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.561062 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99b4e10d-1cb1-4558-87a3-78a28258f5f8-catalog-content\") pod \"certified-operators-zzghw\" (UID: \"99b4e10d-1cb1-4558-87a3-78a28258f5f8\") " pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.561117 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9gsq\" (UniqueName: \"kubernetes.io/projected/99b4e10d-1cb1-4558-87a3-78a28258f5f8-kube-api-access-v9gsq\") pod \"certified-operators-zzghw\" (UID: \"99b4e10d-1cb1-4558-87a3-78a28258f5f8\") " pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.561199 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99b4e10d-1cb1-4558-87a3-78a28258f5f8-utilities\") pod \"certified-operators-zzghw\" (UID: \"99b4e10d-1cb1-4558-87a3-78a28258f5f8\") " pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.582286 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vgbg2"] Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.585064 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.595757 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vgbg2"] Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.662631 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99b4e10d-1cb1-4558-87a3-78a28258f5f8-catalog-content\") pod \"certified-operators-zzghw\" (UID: \"99b4e10d-1cb1-4558-87a3-78a28258f5f8\") " pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.662679 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9gsq\" (UniqueName: \"kubernetes.io/projected/99b4e10d-1cb1-4558-87a3-78a28258f5f8-kube-api-access-v9gsq\") pod \"certified-operators-zzghw\" (UID: \"99b4e10d-1cb1-4558-87a3-78a28258f5f8\") " pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.662740 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99b4e10d-1cb1-4558-87a3-78a28258f5f8-utilities\") pod \"certified-operators-zzghw\" (UID: \"99b4e10d-1cb1-4558-87a3-78a28258f5f8\") " pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.663325 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99b4e10d-1cb1-4558-87a3-78a28258f5f8-utilities\") pod \"certified-operators-zzghw\" (UID: \"99b4e10d-1cb1-4558-87a3-78a28258f5f8\") " pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.663871 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99b4e10d-1cb1-4558-87a3-78a28258f5f8-catalog-content\") pod \"certified-operators-zzghw\" (UID: \"99b4e10d-1cb1-4558-87a3-78a28258f5f8\") " pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.700907 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9gsq\" (UniqueName: \"kubernetes.io/projected/99b4e10d-1cb1-4558-87a3-78a28258f5f8-kube-api-access-v9gsq\") pod \"certified-operators-zzghw\" (UID: \"99b4e10d-1cb1-4558-87a3-78a28258f5f8\") " pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.751241 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.764866 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpn8m\" (UniqueName: \"kubernetes.io/projected/763f1257-a48c-47f1-a607-8a185ff79ae4-kube-api-access-tpn8m\") pod \"community-operators-vgbg2\" (UID: \"763f1257-a48c-47f1-a607-8a185ff79ae4\") " pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.765669 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763f1257-a48c-47f1-a607-8a185ff79ae4-utilities\") pod \"community-operators-vgbg2\" (UID: \"763f1257-a48c-47f1-a607-8a185ff79ae4\") " pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.766030 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763f1257-a48c-47f1-a607-8a185ff79ae4-catalog-content\") pod \"community-operators-vgbg2\" (UID: \"763f1257-a48c-47f1-a607-8a185ff79ae4\") " pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.868070 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763f1257-a48c-47f1-a607-8a185ff79ae4-utilities\") pod \"community-operators-vgbg2\" (UID: \"763f1257-a48c-47f1-a607-8a185ff79ae4\") " pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.868207 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763f1257-a48c-47f1-a607-8a185ff79ae4-catalog-content\") pod \"community-operators-vgbg2\" (UID: \"763f1257-a48c-47f1-a607-8a185ff79ae4\") " pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.868259 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpn8m\" (UniqueName: \"kubernetes.io/projected/763f1257-a48c-47f1-a607-8a185ff79ae4-kube-api-access-tpn8m\") pod \"community-operators-vgbg2\" (UID: \"763f1257-a48c-47f1-a607-8a185ff79ae4\") " pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.868731 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763f1257-a48c-47f1-a607-8a185ff79ae4-utilities\") pod \"community-operators-vgbg2\" (UID: \"763f1257-a48c-47f1-a607-8a185ff79ae4\") " pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.869008 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763f1257-a48c-47f1-a607-8a185ff79ae4-catalog-content\") pod \"community-operators-vgbg2\" (UID: \"763f1257-a48c-47f1-a607-8a185ff79ae4\") " pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:44 crc kubenswrapper[4664]: I1003 08:32:44.914453 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpn8m\" (UniqueName: \"kubernetes.io/projected/763f1257-a48c-47f1-a607-8a185ff79ae4-kube-api-access-tpn8m\") pod 
\"community-operators-vgbg2\" (UID: \"763f1257-a48c-47f1-a607-8a185ff79ae4\") " pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:45 crc kubenswrapper[4664]: I1003 08:32:45.212119 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:45 crc kubenswrapper[4664]: I1003 08:32:45.334771 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zzghw"] Oct 03 08:32:45 crc kubenswrapper[4664]: I1003 08:32:45.697167 4664 generic.go:334] "Generic (PLEG): container finished" podID="99b4e10d-1cb1-4558-87a3-78a28258f5f8" containerID="2985b4ea91af466e87c87a0ee14f773636b12131120b4ed2269736bd52309544" exitCode=0 Oct 03 08:32:45 crc kubenswrapper[4664]: I1003 08:32:45.697275 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzghw" event={"ID":"99b4e10d-1cb1-4558-87a3-78a28258f5f8","Type":"ContainerDied","Data":"2985b4ea91af466e87c87a0ee14f773636b12131120b4ed2269736bd52309544"} Oct 03 08:32:45 crc kubenswrapper[4664]: I1003 08:32:45.697563 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzghw" event={"ID":"99b4e10d-1cb1-4558-87a3-78a28258f5f8","Type":"ContainerStarted","Data":"e4f04ce7eccfcba255c05de3d72722985a5fc65e60fb92825063a3e67cb3e15d"} Oct 03 08:32:45 crc kubenswrapper[4664]: I1003 08:32:45.887301 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vgbg2"] Oct 03 08:32:45 crc kubenswrapper[4664]: W1003 08:32:45.904208 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod763f1257_a48c_47f1_a607_8a185ff79ae4.slice/crio-155284b4a08b14626310b682e079251b990047dd62970f50b4560aaf992b3eb0 WatchSource:0}: Error finding container 155284b4a08b14626310b682e079251b990047dd62970f50b4560aaf992b3eb0: Status 404 returned error can't find the container with id 155284b4a08b14626310b682e079251b990047dd62970f50b4560aaf992b3eb0 Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.708005 4664 generic.go:334] "Generic (PLEG): container finished" podID="763f1257-a48c-47f1-a607-8a185ff79ae4" containerID="610ca5a7ecdcf751b40ca22ff23c767d138963b96b0d35ad546c8add30812a02" exitCode=0 Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.708121 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgbg2" event={"ID":"763f1257-a48c-47f1-a607-8a185ff79ae4","Type":"ContainerDied","Data":"610ca5a7ecdcf751b40ca22ff23c767d138963b96b0d35ad546c8add30812a02"} Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.708873 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgbg2" event={"ID":"763f1257-a48c-47f1-a607-8a185ff79ae4","Type":"ContainerStarted","Data":"155284b4a08b14626310b682e079251b990047dd62970f50b4560aaf992b3eb0"} Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.712597 4664 generic.go:334] "Generic (PLEG): container finished" podID="99b4e10d-1cb1-4558-87a3-78a28258f5f8" containerID="338518b027e48c6428e271a8d21398e64b2a2525c282fdad39eb9e32d6113417" exitCode=0 Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.712630 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzghw" 
event={"ID":"99b4e10d-1cb1-4558-87a3-78a28258f5f8","Type":"ContainerDied","Data":"338518b027e48c6428e271a8d21398e64b2a2525c282fdad39eb9e32d6113417"} Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.786382 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qc8tp"] Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.789169 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.798189 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qc8tp"] Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.823403 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tlhx\" (UniqueName: \"kubernetes.io/projected/1db0c593-1288-4661-9e96-73a98f5047bd-kube-api-access-9tlhx\") pod \"redhat-marketplace-qc8tp\" (UID: \"1db0c593-1288-4661-9e96-73a98f5047bd\") " pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.824035 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db0c593-1288-4661-9e96-73a98f5047bd-utilities\") pod \"redhat-marketplace-qc8tp\" (UID: \"1db0c593-1288-4661-9e96-73a98f5047bd\") " pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.824180 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db0c593-1288-4661-9e96-73a98f5047bd-catalog-content\") pod \"redhat-marketplace-qc8tp\" (UID: \"1db0c593-1288-4661-9e96-73a98f5047bd\") " pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.925028 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tlhx\" (UniqueName: \"kubernetes.io/projected/1db0c593-1288-4661-9e96-73a98f5047bd-kube-api-access-9tlhx\") pod \"redhat-marketplace-qc8tp\" (UID: \"1db0c593-1288-4661-9e96-73a98f5047bd\") " pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.925537 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db0c593-1288-4661-9e96-73a98f5047bd-utilities\") pod \"redhat-marketplace-qc8tp\" (UID: \"1db0c593-1288-4661-9e96-73a98f5047bd\") " pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.925673 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db0c593-1288-4661-9e96-73a98f5047bd-catalog-content\") pod \"redhat-marketplace-qc8tp\" (UID: \"1db0c593-1288-4661-9e96-73a98f5047bd\") " pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.926308 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db0c593-1288-4661-9e96-73a98f5047bd-utilities\") pod \"redhat-marketplace-qc8tp\" (UID: \"1db0c593-1288-4661-9e96-73a98f5047bd\") " pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.928038 4664 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db0c593-1288-4661-9e96-73a98f5047bd-catalog-content\") pod \"redhat-marketplace-qc8tp\" (UID: \"1db0c593-1288-4661-9e96-73a98f5047bd\") " pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:32:46 crc kubenswrapper[4664]: I1003 08:32:46.949664 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tlhx\" (UniqueName: \"kubernetes.io/projected/1db0c593-1288-4661-9e96-73a98f5047bd-kube-api-access-9tlhx\") pod \"redhat-marketplace-qc8tp\" (UID: \"1db0c593-1288-4661-9e96-73a98f5047bd\") " pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:32:47 crc kubenswrapper[4664]: I1003 08:32:47.108821 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:32:47 crc kubenswrapper[4664]: I1003 08:32:47.572106 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qc8tp"] Oct 03 08:32:47 crc kubenswrapper[4664]: I1003 08:32:47.727427 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgbg2" event={"ID":"763f1257-a48c-47f1-a607-8a185ff79ae4","Type":"ContainerStarted","Data":"72437398c31865fabf88b0e0b5212d41801480381d7c5ae6de2afe2ab260bf4b"} Oct 03 08:32:47 crc kubenswrapper[4664]: I1003 08:32:47.729568 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qc8tp" event={"ID":"1db0c593-1288-4661-9e96-73a98f5047bd","Type":"ContainerStarted","Data":"862172b6f9e699838b2043af0e641c912651829d93e97e2b5a9badf415396a57"} Oct 03 08:32:47 crc kubenswrapper[4664]: I1003 08:32:47.732396 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzghw" event={"ID":"99b4e10d-1cb1-4558-87a3-78a28258f5f8","Type":"ContainerStarted","Data":"44c5b4688306def22bcaee87a4e36da3881d68024bbabdc183d4cc557102029f"} Oct 03 08:32:47 crc kubenswrapper[4664]: I1003 08:32:47.779460 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zzghw" podStartSLOduration=2.045040334 podStartE2EDuration="3.779435123s" podCreationTimestamp="2025-10-03 08:32:44 +0000 UTC" firstStartedPulling="2025-10-03 08:32:45.700469998 +0000 UTC m=+2666.521660488" lastFinishedPulling="2025-10-03 08:32:47.434864787 +0000 UTC m=+2668.256055277" observedRunningTime="2025-10-03 08:32:47.775456969 +0000 UTC m=+2668.596647479" watchObservedRunningTime="2025-10-03 08:32:47.779435123 +0000 UTC m=+2668.600625613" Oct 03 08:32:48 crc kubenswrapper[4664]: I1003 08:32:48.743240 4664 generic.go:334] "Generic (PLEG): container finished" podID="763f1257-a48c-47f1-a607-8a185ff79ae4" containerID="72437398c31865fabf88b0e0b5212d41801480381d7c5ae6de2afe2ab260bf4b" exitCode=0 Oct 03 08:32:48 crc kubenswrapper[4664]: I1003 08:32:48.744134 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgbg2" event={"ID":"763f1257-a48c-47f1-a607-8a185ff79ae4","Type":"ContainerDied","Data":"72437398c31865fabf88b0e0b5212d41801480381d7c5ae6de2afe2ab260bf4b"} Oct 03 08:32:48 crc kubenswrapper[4664]: I1003 08:32:48.746489 4664 generic.go:334] "Generic (PLEG): container finished" podID="1db0c593-1288-4661-9e96-73a98f5047bd" containerID="af4110d18d5ee6352a0bd4dba9f227b32e6959a6d6124c7dc407d212833f4ee2" exitCode=0 Oct 03 08:32:48 crc 
kubenswrapper[4664]: I1003 08:32:48.747312 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qc8tp" event={"ID":"1db0c593-1288-4661-9e96-73a98f5047bd","Type":"ContainerDied","Data":"af4110d18d5ee6352a0bd4dba9f227b32e6959a6d6124c7dc407d212833f4ee2"} Oct 03 08:32:48 crc kubenswrapper[4664]: I1003 08:32:48.877585 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:32:48 crc kubenswrapper[4664]: E1003 08:32:48.877902 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:32:49 crc kubenswrapper[4664]: I1003 08:32:49.760088 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgbg2" event={"ID":"763f1257-a48c-47f1-a607-8a185ff79ae4","Type":"ContainerStarted","Data":"a53b81bf9ab26f4234b30b1dcb1798f601b3978019ea6948187b4497391d0210"} Oct 03 08:32:49 crc kubenswrapper[4664]: I1003 08:32:49.762485 4664 generic.go:334] "Generic (PLEG): container finished" podID="1db0c593-1288-4661-9e96-73a98f5047bd" containerID="4efe15c60beb671de8a420eb8e82d8f6838a18bd58c2c72d23fee5d6f0631563" exitCode=0 Oct 03 08:32:49 crc kubenswrapper[4664]: I1003 08:32:49.762551 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qc8tp" event={"ID":"1db0c593-1288-4661-9e96-73a98f5047bd","Type":"ContainerDied","Data":"4efe15c60beb671de8a420eb8e82d8f6838a18bd58c2c72d23fee5d6f0631563"} Oct 03 08:32:49 crc kubenswrapper[4664]: I1003 08:32:49.790206 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vgbg2" podStartSLOduration=3.34506541 podStartE2EDuration="5.790183912s" podCreationTimestamp="2025-10-03 08:32:44 +0000 UTC" firstStartedPulling="2025-10-03 08:32:46.710081663 +0000 UTC m=+2667.531272153" lastFinishedPulling="2025-10-03 08:32:49.155200165 +0000 UTC m=+2669.976390655" observedRunningTime="2025-10-03 08:32:49.781122962 +0000 UTC m=+2670.602313472" watchObservedRunningTime="2025-10-03 08:32:49.790183912 +0000 UTC m=+2670.611374402" Oct 03 08:32:50 crc kubenswrapper[4664]: I1003 08:32:50.774109 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qc8tp" event={"ID":"1db0c593-1288-4661-9e96-73a98f5047bd","Type":"ContainerStarted","Data":"88a9be02a867685b14c27ea30195ccd2b80c2e16e92c1388fd1fc01b6748e56d"} Oct 03 08:32:50 crc kubenswrapper[4664]: I1003 08:32:50.793990 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qc8tp" podStartSLOduration=3.187141901 podStartE2EDuration="4.793963091s" podCreationTimestamp="2025-10-03 08:32:46 +0000 UTC" firstStartedPulling="2025-10-03 08:32:48.74864664 +0000 UTC m=+2669.569837130" lastFinishedPulling="2025-10-03 08:32:50.35546783 +0000 UTC m=+2671.176658320" observedRunningTime="2025-10-03 08:32:50.790075469 +0000 UTC m=+2671.611265969" watchObservedRunningTime="2025-10-03 08:32:50.793963091 +0000 UTC m=+2671.615153591" Oct 03 08:32:54 crc kubenswrapper[4664]: I1003 08:32:54.752060 4664 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:54 crc kubenswrapper[4664]: I1003 08:32:54.752330 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:54 crc kubenswrapper[4664]: I1003 08:32:54.812678 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:54 crc kubenswrapper[4664]: I1003 08:32:54.865350 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:55 crc kubenswrapper[4664]: I1003 08:32:55.212668 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:55 crc kubenswrapper[4664]: I1003 08:32:55.212738 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:55 crc kubenswrapper[4664]: I1003 08:32:55.258956 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:55 crc kubenswrapper[4664]: I1003 08:32:55.369942 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zzghw"] Oct 03 08:32:55 crc kubenswrapper[4664]: I1003 08:32:55.860739 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:56 crc kubenswrapper[4664]: I1003 08:32:56.826232 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zzghw" podUID="99b4e10d-1cb1-4558-87a3-78a28258f5f8" containerName="registry-server" containerID="cri-o://44c5b4688306def22bcaee87a4e36da3881d68024bbabdc183d4cc557102029f" gracePeriod=2 Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.109865 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.110678 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.171198 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.436807 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.552658 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9gsq\" (UniqueName: \"kubernetes.io/projected/99b4e10d-1cb1-4558-87a3-78a28258f5f8-kube-api-access-v9gsq\") pod \"99b4e10d-1cb1-4558-87a3-78a28258f5f8\" (UID: \"99b4e10d-1cb1-4558-87a3-78a28258f5f8\") " Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.554813 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99b4e10d-1cb1-4558-87a3-78a28258f5f8-catalog-content\") pod \"99b4e10d-1cb1-4558-87a3-78a28258f5f8\" (UID: \"99b4e10d-1cb1-4558-87a3-78a28258f5f8\") " Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.554949 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99b4e10d-1cb1-4558-87a3-78a28258f5f8-utilities\") pod \"99b4e10d-1cb1-4558-87a3-78a28258f5f8\" (UID: \"99b4e10d-1cb1-4558-87a3-78a28258f5f8\") " Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.556214 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99b4e10d-1cb1-4558-87a3-78a28258f5f8-utilities" (OuterVolumeSpecName: "utilities") pod "99b4e10d-1cb1-4558-87a3-78a28258f5f8" (UID: "99b4e10d-1cb1-4558-87a3-78a28258f5f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.568035 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b4e10d-1cb1-4558-87a3-78a28258f5f8-kube-api-access-v9gsq" (OuterVolumeSpecName: "kube-api-access-v9gsq") pod "99b4e10d-1cb1-4558-87a3-78a28258f5f8" (UID: "99b4e10d-1cb1-4558-87a3-78a28258f5f8"). InnerVolumeSpecName "kube-api-access-v9gsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.575248 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vgbg2"] Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.642870 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99b4e10d-1cb1-4558-87a3-78a28258f5f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99b4e10d-1cb1-4558-87a3-78a28258f5f8" (UID: "99b4e10d-1cb1-4558-87a3-78a28258f5f8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.656970 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9gsq\" (UniqueName: \"kubernetes.io/projected/99b4e10d-1cb1-4558-87a3-78a28258f5f8-kube-api-access-v9gsq\") on node \"crc\" DevicePath \"\"" Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.657013 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99b4e10d-1cb1-4558-87a3-78a28258f5f8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.657026 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99b4e10d-1cb1-4558-87a3-78a28258f5f8-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.838039 4664 generic.go:334] "Generic (PLEG): container finished" podID="99b4e10d-1cb1-4558-87a3-78a28258f5f8" containerID="44c5b4688306def22bcaee87a4e36da3881d68024bbabdc183d4cc557102029f" exitCode=0 Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.838634 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vgbg2" podUID="763f1257-a48c-47f1-a607-8a185ff79ae4" containerName="registry-server" containerID="cri-o://a53b81bf9ab26f4234b30b1dcb1798f601b3978019ea6948187b4497391d0210" gracePeriod=2 Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.839133 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zzghw" Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.840754 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzghw" event={"ID":"99b4e10d-1cb1-4558-87a3-78a28258f5f8","Type":"ContainerDied","Data":"44c5b4688306def22bcaee87a4e36da3881d68024bbabdc183d4cc557102029f"} Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.840822 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzghw" event={"ID":"99b4e10d-1cb1-4558-87a3-78a28258f5f8","Type":"ContainerDied","Data":"e4f04ce7eccfcba255c05de3d72722985a5fc65e60fb92825063a3e67cb3e15d"} Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.840846 4664 scope.go:117] "RemoveContainer" containerID="44c5b4688306def22bcaee87a4e36da3881d68024bbabdc183d4cc557102029f" Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.889410 4664 scope.go:117] "RemoveContainer" containerID="338518b027e48c6428e271a8d21398e64b2a2525c282fdad39eb9e32d6113417" Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.895832 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zzghw"] Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.903461 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zzghw"] Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.913992 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:32:57 crc kubenswrapper[4664]: I1003 08:32:57.956506 4664 scope.go:117] "RemoveContainer" containerID="2985b4ea91af466e87c87a0ee14f773636b12131120b4ed2269736bd52309544" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.040840 4664 scope.go:117] "RemoveContainer" 
containerID="44c5b4688306def22bcaee87a4e36da3881d68024bbabdc183d4cc557102029f" Oct 03 08:32:58 crc kubenswrapper[4664]: E1003 08:32:58.041395 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44c5b4688306def22bcaee87a4e36da3881d68024bbabdc183d4cc557102029f\": container with ID starting with 44c5b4688306def22bcaee87a4e36da3881d68024bbabdc183d4cc557102029f not found: ID does not exist" containerID="44c5b4688306def22bcaee87a4e36da3881d68024bbabdc183d4cc557102029f" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.041443 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44c5b4688306def22bcaee87a4e36da3881d68024bbabdc183d4cc557102029f"} err="failed to get container status \"44c5b4688306def22bcaee87a4e36da3881d68024bbabdc183d4cc557102029f\": rpc error: code = NotFound desc = could not find container \"44c5b4688306def22bcaee87a4e36da3881d68024bbabdc183d4cc557102029f\": container with ID starting with 44c5b4688306def22bcaee87a4e36da3881d68024bbabdc183d4cc557102029f not found: ID does not exist" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.041475 4664 scope.go:117] "RemoveContainer" containerID="338518b027e48c6428e271a8d21398e64b2a2525c282fdad39eb9e32d6113417" Oct 03 08:32:58 crc kubenswrapper[4664]: E1003 08:32:58.041846 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"338518b027e48c6428e271a8d21398e64b2a2525c282fdad39eb9e32d6113417\": container with ID starting with 338518b027e48c6428e271a8d21398e64b2a2525c282fdad39eb9e32d6113417 not found: ID does not exist" containerID="338518b027e48c6428e271a8d21398e64b2a2525c282fdad39eb9e32d6113417" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.041879 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338518b027e48c6428e271a8d21398e64b2a2525c282fdad39eb9e32d6113417"} err="failed to get container status \"338518b027e48c6428e271a8d21398e64b2a2525c282fdad39eb9e32d6113417\": rpc error: code = NotFound desc = could not find container \"338518b027e48c6428e271a8d21398e64b2a2525c282fdad39eb9e32d6113417\": container with ID starting with 338518b027e48c6428e271a8d21398e64b2a2525c282fdad39eb9e32d6113417 not found: ID does not exist" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.041906 4664 scope.go:117] "RemoveContainer" containerID="2985b4ea91af466e87c87a0ee14f773636b12131120b4ed2269736bd52309544" Oct 03 08:32:58 crc kubenswrapper[4664]: E1003 08:32:58.042576 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2985b4ea91af466e87c87a0ee14f773636b12131120b4ed2269736bd52309544\": container with ID starting with 2985b4ea91af466e87c87a0ee14f773636b12131120b4ed2269736bd52309544 not found: ID does not exist" containerID="2985b4ea91af466e87c87a0ee14f773636b12131120b4ed2269736bd52309544" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.042692 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2985b4ea91af466e87c87a0ee14f773636b12131120b4ed2269736bd52309544"} err="failed to get container status \"2985b4ea91af466e87c87a0ee14f773636b12131120b4ed2269736bd52309544\": rpc error: code = NotFound desc = could not find container \"2985b4ea91af466e87c87a0ee14f773636b12131120b4ed2269736bd52309544\": container with ID starting with 
2985b4ea91af466e87c87a0ee14f773636b12131120b4ed2269736bd52309544 not found: ID does not exist" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.370314 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.473112 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763f1257-a48c-47f1-a607-8a185ff79ae4-utilities\") pod \"763f1257-a48c-47f1-a607-8a185ff79ae4\" (UID: \"763f1257-a48c-47f1-a607-8a185ff79ae4\") " Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.473570 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763f1257-a48c-47f1-a607-8a185ff79ae4-catalog-content\") pod \"763f1257-a48c-47f1-a607-8a185ff79ae4\" (UID: \"763f1257-a48c-47f1-a607-8a185ff79ae4\") " Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.473749 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpn8m\" (UniqueName: \"kubernetes.io/projected/763f1257-a48c-47f1-a607-8a185ff79ae4-kube-api-access-tpn8m\") pod \"763f1257-a48c-47f1-a607-8a185ff79ae4\" (UID: \"763f1257-a48c-47f1-a607-8a185ff79ae4\") " Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.474437 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/763f1257-a48c-47f1-a607-8a185ff79ae4-utilities" (OuterVolumeSpecName: "utilities") pod "763f1257-a48c-47f1-a607-8a185ff79ae4" (UID: "763f1257-a48c-47f1-a607-8a185ff79ae4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.479457 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/763f1257-a48c-47f1-a607-8a185ff79ae4-kube-api-access-tpn8m" (OuterVolumeSpecName: "kube-api-access-tpn8m") pod "763f1257-a48c-47f1-a607-8a185ff79ae4" (UID: "763f1257-a48c-47f1-a607-8a185ff79ae4"). InnerVolumeSpecName "kube-api-access-tpn8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.576279 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763f1257-a48c-47f1-a607-8a185ff79ae4-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.576329 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpn8m\" (UniqueName: \"kubernetes.io/projected/763f1257-a48c-47f1-a607-8a185ff79ae4-kube-api-access-tpn8m\") on node \"crc\" DevicePath \"\"" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.855652 4664 generic.go:334] "Generic (PLEG): container finished" podID="763f1257-a48c-47f1-a607-8a185ff79ae4" containerID="a53b81bf9ab26f4234b30b1dcb1798f601b3978019ea6948187b4497391d0210" exitCode=0 Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.856739 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgbg2" event={"ID":"763f1257-a48c-47f1-a607-8a185ff79ae4","Type":"ContainerDied","Data":"a53b81bf9ab26f4234b30b1dcb1798f601b3978019ea6948187b4497391d0210"} Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.856816 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgbg2" event={"ID":"763f1257-a48c-47f1-a607-8a185ff79ae4","Type":"ContainerDied","Data":"155284b4a08b14626310b682e079251b990047dd62970f50b4560aaf992b3eb0"} Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.856829 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vgbg2" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.856843 4664 scope.go:117] "RemoveContainer" containerID="a53b81bf9ab26f4234b30b1dcb1798f601b3978019ea6948187b4497391d0210" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.878483 4664 scope.go:117] "RemoveContainer" containerID="72437398c31865fabf88b0e0b5212d41801480381d7c5ae6de2afe2ab260bf4b" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.894706 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/763f1257-a48c-47f1-a607-8a185ff79ae4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "763f1257-a48c-47f1-a607-8a185ff79ae4" (UID: "763f1257-a48c-47f1-a607-8a185ff79ae4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.903017 4664 scope.go:117] "RemoveContainer" containerID="610ca5a7ecdcf751b40ca22ff23c767d138963b96b0d35ad546c8add30812a02" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.925924 4664 scope.go:117] "RemoveContainer" containerID="a53b81bf9ab26f4234b30b1dcb1798f601b3978019ea6948187b4497391d0210" Oct 03 08:32:58 crc kubenswrapper[4664]: E1003 08:32:58.926728 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a53b81bf9ab26f4234b30b1dcb1798f601b3978019ea6948187b4497391d0210\": container with ID starting with a53b81bf9ab26f4234b30b1dcb1798f601b3978019ea6948187b4497391d0210 not found: ID does not exist" containerID="a53b81bf9ab26f4234b30b1dcb1798f601b3978019ea6948187b4497391d0210" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.926804 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53b81bf9ab26f4234b30b1dcb1798f601b3978019ea6948187b4497391d0210"} err="failed to get container status \"a53b81bf9ab26f4234b30b1dcb1798f601b3978019ea6948187b4497391d0210\": rpc error: code = NotFound desc = could not find container \"a53b81bf9ab26f4234b30b1dcb1798f601b3978019ea6948187b4497391d0210\": container with ID starting with a53b81bf9ab26f4234b30b1dcb1798f601b3978019ea6948187b4497391d0210 not found: ID does not exist" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.926846 4664 scope.go:117] "RemoveContainer" containerID="72437398c31865fabf88b0e0b5212d41801480381d7c5ae6de2afe2ab260bf4b" Oct 03 08:32:58 crc kubenswrapper[4664]: E1003 08:32:58.927135 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72437398c31865fabf88b0e0b5212d41801480381d7c5ae6de2afe2ab260bf4b\": container with ID starting with 72437398c31865fabf88b0e0b5212d41801480381d7c5ae6de2afe2ab260bf4b not found: ID does not exist" containerID="72437398c31865fabf88b0e0b5212d41801480381d7c5ae6de2afe2ab260bf4b" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.927180 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72437398c31865fabf88b0e0b5212d41801480381d7c5ae6de2afe2ab260bf4b"} err="failed to get container status \"72437398c31865fabf88b0e0b5212d41801480381d7c5ae6de2afe2ab260bf4b\": rpc error: code = NotFound desc = could not find container \"72437398c31865fabf88b0e0b5212d41801480381d7c5ae6de2afe2ab260bf4b\": container with ID starting with 72437398c31865fabf88b0e0b5212d41801480381d7c5ae6de2afe2ab260bf4b not found: ID does not exist" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.927197 4664 scope.go:117] "RemoveContainer" containerID="610ca5a7ecdcf751b40ca22ff23c767d138963b96b0d35ad546c8add30812a02" Oct 03 08:32:58 crc kubenswrapper[4664]: E1003 08:32:58.927712 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"610ca5a7ecdcf751b40ca22ff23c767d138963b96b0d35ad546c8add30812a02\": container with ID starting with 610ca5a7ecdcf751b40ca22ff23c767d138963b96b0d35ad546c8add30812a02 not found: ID does not exist" containerID="610ca5a7ecdcf751b40ca22ff23c767d138963b96b0d35ad546c8add30812a02" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.927772 4664 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"610ca5a7ecdcf751b40ca22ff23c767d138963b96b0d35ad546c8add30812a02"} err="failed to get container status \"610ca5a7ecdcf751b40ca22ff23c767d138963b96b0d35ad546c8add30812a02\": rpc error: code = NotFound desc = could not find container \"610ca5a7ecdcf751b40ca22ff23c767d138963b96b0d35ad546c8add30812a02\": container with ID starting with 610ca5a7ecdcf751b40ca22ff23c767d138963b96b0d35ad546c8add30812a02 not found: ID does not exist" Oct 03 08:32:58 crc kubenswrapper[4664]: I1003 08:32:58.993225 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763f1257-a48c-47f1-a607-8a185ff79ae4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:32:59 crc kubenswrapper[4664]: I1003 08:32:59.204051 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vgbg2"] Oct 03 08:32:59 crc kubenswrapper[4664]: I1003 08:32:59.214977 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vgbg2"] Oct 03 08:32:59 crc kubenswrapper[4664]: I1003 08:32:59.889858 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="763f1257-a48c-47f1-a607-8a185ff79ae4" path="/var/lib/kubelet/pods/763f1257-a48c-47f1-a607-8a185ff79ae4/volumes" Oct 03 08:32:59 crc kubenswrapper[4664]: I1003 08:32:59.890738 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99b4e10d-1cb1-4558-87a3-78a28258f5f8" path="/var/lib/kubelet/pods/99b4e10d-1cb1-4558-87a3-78a28258f5f8/volumes" Oct 03 08:32:59 crc kubenswrapper[4664]: I1003 08:32:59.971922 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qc8tp"] Oct 03 08:33:00 crc kubenswrapper[4664]: I1003 08:33:00.873474 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qc8tp" podUID="1db0c593-1288-4661-9e96-73a98f5047bd" containerName="registry-server" containerID="cri-o://88a9be02a867685b14c27ea30195ccd2b80c2e16e92c1388fd1fc01b6748e56d" gracePeriod=2 Oct 03 08:33:00 crc kubenswrapper[4664]: I1003 08:33:00.876643 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:33:00 crc kubenswrapper[4664]: E1003 08:33:00.876928 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.330341 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.332870 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db0c593-1288-4661-9e96-73a98f5047bd-utilities\") pod \"1db0c593-1288-4661-9e96-73a98f5047bd\" (UID: \"1db0c593-1288-4661-9e96-73a98f5047bd\") " Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.332920 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db0c593-1288-4661-9e96-73a98f5047bd-catalog-content\") pod \"1db0c593-1288-4661-9e96-73a98f5047bd\" (UID: \"1db0c593-1288-4661-9e96-73a98f5047bd\") " Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.333243 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tlhx\" (UniqueName: \"kubernetes.io/projected/1db0c593-1288-4661-9e96-73a98f5047bd-kube-api-access-9tlhx\") pod \"1db0c593-1288-4661-9e96-73a98f5047bd\" (UID: \"1db0c593-1288-4661-9e96-73a98f5047bd\") " Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.334230 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1db0c593-1288-4661-9e96-73a98f5047bd-utilities" (OuterVolumeSpecName: "utilities") pod "1db0c593-1288-4661-9e96-73a98f5047bd" (UID: "1db0c593-1288-4661-9e96-73a98f5047bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.338859 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1db0c593-1288-4661-9e96-73a98f5047bd-kube-api-access-9tlhx" (OuterVolumeSpecName: "kube-api-access-9tlhx") pod "1db0c593-1288-4661-9e96-73a98f5047bd" (UID: "1db0c593-1288-4661-9e96-73a98f5047bd"). InnerVolumeSpecName "kube-api-access-9tlhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.352672 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1db0c593-1288-4661-9e96-73a98f5047bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1db0c593-1288-4661-9e96-73a98f5047bd" (UID: "1db0c593-1288-4661-9e96-73a98f5047bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.434466 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tlhx\" (UniqueName: \"kubernetes.io/projected/1db0c593-1288-4661-9e96-73a98f5047bd-kube-api-access-9tlhx\") on node \"crc\" DevicePath \"\"" Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.434773 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db0c593-1288-4661-9e96-73a98f5047bd-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.434847 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db0c593-1288-4661-9e96-73a98f5047bd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.889163 4664 generic.go:334] "Generic (PLEG): container finished" podID="1db0c593-1288-4661-9e96-73a98f5047bd" containerID="88a9be02a867685b14c27ea30195ccd2b80c2e16e92c1388fd1fc01b6748e56d" exitCode=0 Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.889310 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qc8tp" Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.897218 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qc8tp" event={"ID":"1db0c593-1288-4661-9e96-73a98f5047bd","Type":"ContainerDied","Data":"88a9be02a867685b14c27ea30195ccd2b80c2e16e92c1388fd1fc01b6748e56d"} Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.897290 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qc8tp" event={"ID":"1db0c593-1288-4661-9e96-73a98f5047bd","Type":"ContainerDied","Data":"862172b6f9e699838b2043af0e641c912651829d93e97e2b5a9badf415396a57"} Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.897318 4664 scope.go:117] "RemoveContainer" containerID="88a9be02a867685b14c27ea30195ccd2b80c2e16e92c1388fd1fc01b6748e56d" Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.936560 4664 scope.go:117] "RemoveContainer" containerID="4efe15c60beb671de8a420eb8e82d8f6838a18bd58c2c72d23fee5d6f0631563" Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.940268 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qc8tp"] Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.948454 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qc8tp"] Oct 03 08:33:01 crc kubenswrapper[4664]: I1003 08:33:01.961051 4664 scope.go:117] "RemoveContainer" containerID="af4110d18d5ee6352a0bd4dba9f227b32e6959a6d6124c7dc407d212833f4ee2" Oct 03 08:33:02 crc kubenswrapper[4664]: I1003 08:33:02.012231 4664 scope.go:117] "RemoveContainer" containerID="88a9be02a867685b14c27ea30195ccd2b80c2e16e92c1388fd1fc01b6748e56d" Oct 03 08:33:02 crc kubenswrapper[4664]: E1003 08:33:02.014227 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88a9be02a867685b14c27ea30195ccd2b80c2e16e92c1388fd1fc01b6748e56d\": container with ID starting with 88a9be02a867685b14c27ea30195ccd2b80c2e16e92c1388fd1fc01b6748e56d not found: ID does not exist" containerID="88a9be02a867685b14c27ea30195ccd2b80c2e16e92c1388fd1fc01b6748e56d" Oct 03 08:33:02 crc kubenswrapper[4664]: I1003 08:33:02.014366 4664 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a9be02a867685b14c27ea30195ccd2b80c2e16e92c1388fd1fc01b6748e56d"} err="failed to get container status \"88a9be02a867685b14c27ea30195ccd2b80c2e16e92c1388fd1fc01b6748e56d\": rpc error: code = NotFound desc = could not find container \"88a9be02a867685b14c27ea30195ccd2b80c2e16e92c1388fd1fc01b6748e56d\": container with ID starting with 88a9be02a867685b14c27ea30195ccd2b80c2e16e92c1388fd1fc01b6748e56d not found: ID does not exist" Oct 03 08:33:02 crc kubenswrapper[4664]: I1003 08:33:02.014421 4664 scope.go:117] "RemoveContainer" containerID="4efe15c60beb671de8a420eb8e82d8f6838a18bd58c2c72d23fee5d6f0631563" Oct 03 08:33:02 crc kubenswrapper[4664]: E1003 08:33:02.015216 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4efe15c60beb671de8a420eb8e82d8f6838a18bd58c2c72d23fee5d6f0631563\": container with ID starting with 4efe15c60beb671de8a420eb8e82d8f6838a18bd58c2c72d23fee5d6f0631563 not found: ID does not exist" containerID="4efe15c60beb671de8a420eb8e82d8f6838a18bd58c2c72d23fee5d6f0631563" Oct 03 08:33:02 crc kubenswrapper[4664]: I1003 08:33:02.015288 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4efe15c60beb671de8a420eb8e82d8f6838a18bd58c2c72d23fee5d6f0631563"} err="failed to get container status \"4efe15c60beb671de8a420eb8e82d8f6838a18bd58c2c72d23fee5d6f0631563\": rpc error: code = NotFound desc = could not find container \"4efe15c60beb671de8a420eb8e82d8f6838a18bd58c2c72d23fee5d6f0631563\": container with ID starting with 4efe15c60beb671de8a420eb8e82d8f6838a18bd58c2c72d23fee5d6f0631563 not found: ID does not exist" Oct 03 08:33:02 crc kubenswrapper[4664]: I1003 08:33:02.015318 4664 scope.go:117] "RemoveContainer" containerID="af4110d18d5ee6352a0bd4dba9f227b32e6959a6d6124c7dc407d212833f4ee2" Oct 03 08:33:02 crc kubenswrapper[4664]: E1003 08:33:02.015751 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af4110d18d5ee6352a0bd4dba9f227b32e6959a6d6124c7dc407d212833f4ee2\": container with ID starting with af4110d18d5ee6352a0bd4dba9f227b32e6959a6d6124c7dc407d212833f4ee2 not found: ID does not exist" containerID="af4110d18d5ee6352a0bd4dba9f227b32e6959a6d6124c7dc407d212833f4ee2" Oct 03 08:33:02 crc kubenswrapper[4664]: I1003 08:33:02.015854 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af4110d18d5ee6352a0bd4dba9f227b32e6959a6d6124c7dc407d212833f4ee2"} err="failed to get container status \"af4110d18d5ee6352a0bd4dba9f227b32e6959a6d6124c7dc407d212833f4ee2\": rpc error: code = NotFound desc = could not find container \"af4110d18d5ee6352a0bd4dba9f227b32e6959a6d6124c7dc407d212833f4ee2\": container with ID starting with af4110d18d5ee6352a0bd4dba9f227b32e6959a6d6124c7dc407d212833f4ee2 not found: ID does not exist" Oct 03 08:33:03 crc kubenswrapper[4664]: I1003 08:33:03.891142 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1db0c593-1288-4661-9e96-73a98f5047bd" path="/var/lib/kubelet/pods/1db0c593-1288-4661-9e96-73a98f5047bd/volumes" Oct 03 08:33:14 crc kubenswrapper[4664]: I1003 08:33:14.876573 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:33:14 crc kubenswrapper[4664]: E1003 08:33:14.877548 4664 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:33:25 crc kubenswrapper[4664]: I1003 08:33:25.876591 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:33:25 crc kubenswrapper[4664]: E1003 08:33:25.877747 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.234491 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xfjmf"] Oct 03 08:33:35 crc kubenswrapper[4664]: E1003 08:33:35.235679 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b4e10d-1cb1-4558-87a3-78a28258f5f8" containerName="extract-utilities" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.235699 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b4e10d-1cb1-4558-87a3-78a28258f5f8" containerName="extract-utilities" Oct 03 08:33:35 crc kubenswrapper[4664]: E1003 08:33:35.235710 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b4e10d-1cb1-4558-87a3-78a28258f5f8" containerName="extract-content" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.235718 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b4e10d-1cb1-4558-87a3-78a28258f5f8" containerName="extract-content" Oct 03 08:33:35 crc kubenswrapper[4664]: E1003 08:33:35.235742 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b4e10d-1cb1-4558-87a3-78a28258f5f8" containerName="registry-server" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.235751 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b4e10d-1cb1-4558-87a3-78a28258f5f8" containerName="registry-server" Oct 03 08:33:35 crc kubenswrapper[4664]: E1003 08:33:35.235788 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763f1257-a48c-47f1-a607-8a185ff79ae4" containerName="extract-content" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.235797 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="763f1257-a48c-47f1-a607-8a185ff79ae4" containerName="extract-content" Oct 03 08:33:35 crc kubenswrapper[4664]: E1003 08:33:35.235821 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db0c593-1288-4661-9e96-73a98f5047bd" containerName="extract-content" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.235831 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db0c593-1288-4661-9e96-73a98f5047bd" containerName="extract-content" Oct 03 08:33:35 crc kubenswrapper[4664]: E1003 08:33:35.235852 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763f1257-a48c-47f1-a607-8a185ff79ae4" containerName="extract-utilities" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.235860 4664 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="763f1257-a48c-47f1-a607-8a185ff79ae4" containerName="extract-utilities" Oct 03 08:33:35 crc kubenswrapper[4664]: E1003 08:33:35.235870 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db0c593-1288-4661-9e96-73a98f5047bd" containerName="extract-utilities" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.235878 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db0c593-1288-4661-9e96-73a98f5047bd" containerName="extract-utilities" Oct 03 08:33:35 crc kubenswrapper[4664]: E1003 08:33:35.235893 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763f1257-a48c-47f1-a607-8a185ff79ae4" containerName="registry-server" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.235901 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="763f1257-a48c-47f1-a607-8a185ff79ae4" containerName="registry-server" Oct 03 08:33:35 crc kubenswrapper[4664]: E1003 08:33:35.235921 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db0c593-1288-4661-9e96-73a98f5047bd" containerName="registry-server" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.235929 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db0c593-1288-4661-9e96-73a98f5047bd" containerName="registry-server" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.236195 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="99b4e10d-1cb1-4558-87a3-78a28258f5f8" containerName="registry-server" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.236228 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="1db0c593-1288-4661-9e96-73a98f5047bd" containerName="registry-server" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.236245 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="763f1257-a48c-47f1-a607-8a185ff79ae4" containerName="registry-server" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.238294 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.246490 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xfjmf"] Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.270694 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4bbe8d-0cba-4580-89b9-442811220f69-utilities\") pod \"redhat-operators-xfjmf\" (UID: \"dd4bbe8d-0cba-4580-89b9-442811220f69\") " pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.270899 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4bbe8d-0cba-4580-89b9-442811220f69-catalog-content\") pod \"redhat-operators-xfjmf\" (UID: \"dd4bbe8d-0cba-4580-89b9-442811220f69\") " pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.270925 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj87j\" (UniqueName: \"kubernetes.io/projected/dd4bbe8d-0cba-4580-89b9-442811220f69-kube-api-access-fj87j\") pod \"redhat-operators-xfjmf\" (UID: \"dd4bbe8d-0cba-4580-89b9-442811220f69\") " pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.373175 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4bbe8d-0cba-4580-89b9-442811220f69-catalog-content\") pod \"redhat-operators-xfjmf\" (UID: \"dd4bbe8d-0cba-4580-89b9-442811220f69\") " pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.373661 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj87j\" (UniqueName: \"kubernetes.io/projected/dd4bbe8d-0cba-4580-89b9-442811220f69-kube-api-access-fj87j\") pod \"redhat-operators-xfjmf\" (UID: \"dd4bbe8d-0cba-4580-89b9-442811220f69\") " pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.373852 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4bbe8d-0cba-4580-89b9-442811220f69-utilities\") pod \"redhat-operators-xfjmf\" (UID: \"dd4bbe8d-0cba-4580-89b9-442811220f69\") " pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.373846 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4bbe8d-0cba-4580-89b9-442811220f69-catalog-content\") pod \"redhat-operators-xfjmf\" (UID: \"dd4bbe8d-0cba-4580-89b9-442811220f69\") " pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.374175 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4bbe8d-0cba-4580-89b9-442811220f69-utilities\") pod \"redhat-operators-xfjmf\" (UID: \"dd4bbe8d-0cba-4580-89b9-442811220f69\") " pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.404635 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fj87j\" (UniqueName: \"kubernetes.io/projected/dd4bbe8d-0cba-4580-89b9-442811220f69-kube-api-access-fj87j\") pod \"redhat-operators-xfjmf\" (UID: \"dd4bbe8d-0cba-4580-89b9-442811220f69\") " pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:35 crc kubenswrapper[4664]: I1003 08:33:35.611503 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:36 crc kubenswrapper[4664]: I1003 08:33:36.128004 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xfjmf"] Oct 03 08:33:36 crc kubenswrapper[4664]: I1003 08:33:36.216598 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfjmf" event={"ID":"dd4bbe8d-0cba-4580-89b9-442811220f69","Type":"ContainerStarted","Data":"e62ed8a63fa2a2b29c23076cfe6d703f2b1959860cb0bc55522823c8a410114d"} Oct 03 08:33:36 crc kubenswrapper[4664]: I1003 08:33:36.877251 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:33:36 crc kubenswrapper[4664]: E1003 08:33:36.877473 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:33:37 crc kubenswrapper[4664]: I1003 08:33:37.227275 4664 generic.go:334] "Generic (PLEG): container finished" podID="dd4bbe8d-0cba-4580-89b9-442811220f69" containerID="4081485a2cf8c2116a35a60959ccc679735278bccc4299b38294098894fda391" exitCode=0 Oct 03 08:33:37 crc kubenswrapper[4664]: I1003 08:33:37.227375 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfjmf" event={"ID":"dd4bbe8d-0cba-4580-89b9-442811220f69","Type":"ContainerDied","Data":"4081485a2cf8c2116a35a60959ccc679735278bccc4299b38294098894fda391"} Oct 03 08:33:37 crc kubenswrapper[4664]: I1003 08:33:37.230795 4664 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:33:38 crc kubenswrapper[4664]: I1003 08:33:38.250145 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfjmf" event={"ID":"dd4bbe8d-0cba-4580-89b9-442811220f69","Type":"ContainerStarted","Data":"36d71f8750788614d972105786c5601a5df5d9190e7baec3ae9611d8cd019f59"} Oct 03 08:33:39 crc kubenswrapper[4664]: I1003 08:33:39.261752 4664 generic.go:334] "Generic (PLEG): container finished" podID="dd4bbe8d-0cba-4580-89b9-442811220f69" containerID="36d71f8750788614d972105786c5601a5df5d9190e7baec3ae9611d8cd019f59" exitCode=0 Oct 03 08:33:39 crc kubenswrapper[4664]: I1003 08:33:39.261854 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfjmf" event={"ID":"dd4bbe8d-0cba-4580-89b9-442811220f69","Type":"ContainerDied","Data":"36d71f8750788614d972105786c5601a5df5d9190e7baec3ae9611d8cd019f59"} Oct 03 08:33:40 crc kubenswrapper[4664]: I1003 08:33:40.274134 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfjmf" 
event={"ID":"dd4bbe8d-0cba-4580-89b9-442811220f69","Type":"ContainerStarted","Data":"57631fe52b966762c53de6fc15f65ffc0a4cf65dfa5dd4ef1eb77515d08fd078"} Oct 03 08:33:45 crc kubenswrapper[4664]: I1003 08:33:45.611784 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:45 crc kubenswrapper[4664]: I1003 08:33:45.612448 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:45 crc kubenswrapper[4664]: I1003 08:33:45.712941 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:45 crc kubenswrapper[4664]: I1003 08:33:45.751592 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xfjmf" podStartSLOduration=8.180203568 podStartE2EDuration="10.751566571s" podCreationTimestamp="2025-10-03 08:33:35 +0000 UTC" firstStartedPulling="2025-10-03 08:33:37.230472661 +0000 UTC m=+2718.051663151" lastFinishedPulling="2025-10-03 08:33:39.801835664 +0000 UTC m=+2720.623026154" observedRunningTime="2025-10-03 08:33:40.307381878 +0000 UTC m=+2721.128572378" watchObservedRunningTime="2025-10-03 08:33:45.751566571 +0000 UTC m=+2726.572757081" Oct 03 08:33:46 crc kubenswrapper[4664]: I1003 08:33:46.379457 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:46 crc kubenswrapper[4664]: I1003 08:33:46.430778 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xfjmf"] Oct 03 08:33:48 crc kubenswrapper[4664]: I1003 08:33:48.345816 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xfjmf" podUID="dd4bbe8d-0cba-4580-89b9-442811220f69" containerName="registry-server" containerID="cri-o://57631fe52b966762c53de6fc15f65ffc0a4cf65dfa5dd4ef1eb77515d08fd078" gracePeriod=2 Oct 03 08:33:48 crc kubenswrapper[4664]: I1003 08:33:48.836014 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:48 crc kubenswrapper[4664]: I1003 08:33:48.946066 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj87j\" (UniqueName: \"kubernetes.io/projected/dd4bbe8d-0cba-4580-89b9-442811220f69-kube-api-access-fj87j\") pod \"dd4bbe8d-0cba-4580-89b9-442811220f69\" (UID: \"dd4bbe8d-0cba-4580-89b9-442811220f69\") " Oct 03 08:33:48 crc kubenswrapper[4664]: I1003 08:33:48.946250 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4bbe8d-0cba-4580-89b9-442811220f69-catalog-content\") pod \"dd4bbe8d-0cba-4580-89b9-442811220f69\" (UID: \"dd4bbe8d-0cba-4580-89b9-442811220f69\") " Oct 03 08:33:48 crc kubenswrapper[4664]: I1003 08:33:48.946364 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4bbe8d-0cba-4580-89b9-442811220f69-utilities\") pod \"dd4bbe8d-0cba-4580-89b9-442811220f69\" (UID: \"dd4bbe8d-0cba-4580-89b9-442811220f69\") " Oct 03 08:33:48 crc kubenswrapper[4664]: I1003 08:33:48.947240 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd4bbe8d-0cba-4580-89b9-442811220f69-utilities" (OuterVolumeSpecName: "utilities") pod "dd4bbe8d-0cba-4580-89b9-442811220f69" (UID: "dd4bbe8d-0cba-4580-89b9-442811220f69"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:33:48 crc kubenswrapper[4664]: I1003 08:33:48.958834 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd4bbe8d-0cba-4580-89b9-442811220f69-kube-api-access-fj87j" (OuterVolumeSpecName: "kube-api-access-fj87j") pod "dd4bbe8d-0cba-4580-89b9-442811220f69" (UID: "dd4bbe8d-0cba-4580-89b9-442811220f69"). InnerVolumeSpecName "kube-api-access-fj87j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:33:49 crc kubenswrapper[4664]: I1003 08:33:49.048830 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj87j\" (UniqueName: \"kubernetes.io/projected/dd4bbe8d-0cba-4580-89b9-442811220f69-kube-api-access-fj87j\") on node \"crc\" DevicePath \"\"" Oct 03 08:33:49 crc kubenswrapper[4664]: I1003 08:33:49.048883 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4bbe8d-0cba-4580-89b9-442811220f69-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:33:49 crc kubenswrapper[4664]: I1003 08:33:49.356448 4664 generic.go:334] "Generic (PLEG): container finished" podID="dd4bbe8d-0cba-4580-89b9-442811220f69" containerID="57631fe52b966762c53de6fc15f65ffc0a4cf65dfa5dd4ef1eb77515d08fd078" exitCode=0 Oct 03 08:33:49 crc kubenswrapper[4664]: I1003 08:33:49.356517 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfjmf" event={"ID":"dd4bbe8d-0cba-4580-89b9-442811220f69","Type":"ContainerDied","Data":"57631fe52b966762c53de6fc15f65ffc0a4cf65dfa5dd4ef1eb77515d08fd078"} Oct 03 08:33:49 crc kubenswrapper[4664]: I1003 08:33:49.356547 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfjmf" event={"ID":"dd4bbe8d-0cba-4580-89b9-442811220f69","Type":"ContainerDied","Data":"e62ed8a63fa2a2b29c23076cfe6d703f2b1959860cb0bc55522823c8a410114d"} Oct 03 08:33:49 crc kubenswrapper[4664]: I1003 08:33:49.356560 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xfjmf" Oct 03 08:33:49 crc kubenswrapper[4664]: I1003 08:33:49.356564 4664 scope.go:117] "RemoveContainer" containerID="57631fe52b966762c53de6fc15f65ffc0a4cf65dfa5dd4ef1eb77515d08fd078" Oct 03 08:33:49 crc kubenswrapper[4664]: I1003 08:33:49.386581 4664 scope.go:117] "RemoveContainer" containerID="36d71f8750788614d972105786c5601a5df5d9190e7baec3ae9611d8cd019f59" Oct 03 08:33:49 crc kubenswrapper[4664]: I1003 08:33:49.410406 4664 scope.go:117] "RemoveContainer" containerID="4081485a2cf8c2116a35a60959ccc679735278bccc4299b38294098894fda391" Oct 03 08:33:49 crc kubenswrapper[4664]: I1003 08:33:49.450964 4664 scope.go:117] "RemoveContainer" containerID="57631fe52b966762c53de6fc15f65ffc0a4cf65dfa5dd4ef1eb77515d08fd078" Oct 03 08:33:49 crc kubenswrapper[4664]: E1003 08:33:49.451414 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57631fe52b966762c53de6fc15f65ffc0a4cf65dfa5dd4ef1eb77515d08fd078\": container with ID starting with 57631fe52b966762c53de6fc15f65ffc0a4cf65dfa5dd4ef1eb77515d08fd078 not found: ID does not exist" containerID="57631fe52b966762c53de6fc15f65ffc0a4cf65dfa5dd4ef1eb77515d08fd078" Oct 03 08:33:49 crc kubenswrapper[4664]: I1003 08:33:49.451448 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57631fe52b966762c53de6fc15f65ffc0a4cf65dfa5dd4ef1eb77515d08fd078"} err="failed to get container status \"57631fe52b966762c53de6fc15f65ffc0a4cf65dfa5dd4ef1eb77515d08fd078\": rpc error: code = NotFound desc = could not find container \"57631fe52b966762c53de6fc15f65ffc0a4cf65dfa5dd4ef1eb77515d08fd078\": container with ID starting with 57631fe52b966762c53de6fc15f65ffc0a4cf65dfa5dd4ef1eb77515d08fd078 not found: ID does not exist" Oct 03 08:33:49 crc kubenswrapper[4664]: I1003 08:33:49.451472 4664 scope.go:117] 
"RemoveContainer" containerID="36d71f8750788614d972105786c5601a5df5d9190e7baec3ae9611d8cd019f59" Oct 03 08:33:49 crc kubenswrapper[4664]: E1003 08:33:49.451992 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36d71f8750788614d972105786c5601a5df5d9190e7baec3ae9611d8cd019f59\": container with ID starting with 36d71f8750788614d972105786c5601a5df5d9190e7baec3ae9611d8cd019f59 not found: ID does not exist" containerID="36d71f8750788614d972105786c5601a5df5d9190e7baec3ae9611d8cd019f59" Oct 03 08:33:49 crc kubenswrapper[4664]: I1003 08:33:49.452047 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d71f8750788614d972105786c5601a5df5d9190e7baec3ae9611d8cd019f59"} err="failed to get container status \"36d71f8750788614d972105786c5601a5df5d9190e7baec3ae9611d8cd019f59\": rpc error: code = NotFound desc = could not find container \"36d71f8750788614d972105786c5601a5df5d9190e7baec3ae9611d8cd019f59\": container with ID starting with 36d71f8750788614d972105786c5601a5df5d9190e7baec3ae9611d8cd019f59 not found: ID does not exist" Oct 03 08:33:49 crc kubenswrapper[4664]: I1003 08:33:49.452080 4664 scope.go:117] "RemoveContainer" containerID="4081485a2cf8c2116a35a60959ccc679735278bccc4299b38294098894fda391" Oct 03 08:33:49 crc kubenswrapper[4664]: E1003 08:33:49.452499 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4081485a2cf8c2116a35a60959ccc679735278bccc4299b38294098894fda391\": container with ID starting with 4081485a2cf8c2116a35a60959ccc679735278bccc4299b38294098894fda391 not found: ID does not exist" containerID="4081485a2cf8c2116a35a60959ccc679735278bccc4299b38294098894fda391" Oct 03 08:33:49 crc kubenswrapper[4664]: I1003 08:33:49.452527 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4081485a2cf8c2116a35a60959ccc679735278bccc4299b38294098894fda391"} err="failed to get container status \"4081485a2cf8c2116a35a60959ccc679735278bccc4299b38294098894fda391\": rpc error: code = NotFound desc = could not find container \"4081485a2cf8c2116a35a60959ccc679735278bccc4299b38294098894fda391\": container with ID starting with 4081485a2cf8c2116a35a60959ccc679735278bccc4299b38294098894fda391 not found: ID does not exist" Oct 03 08:33:50 crc kubenswrapper[4664]: I1003 08:33:50.312101 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd4bbe8d-0cba-4580-89b9-442811220f69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd4bbe8d-0cba-4580-89b9-442811220f69" (UID: "dd4bbe8d-0cba-4580-89b9-442811220f69"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:33:50 crc kubenswrapper[4664]: I1003 08:33:50.377351 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4bbe8d-0cba-4580-89b9-442811220f69-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:33:50 crc kubenswrapper[4664]: I1003 08:33:50.597321 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xfjmf"] Oct 03 08:33:50 crc kubenswrapper[4664]: I1003 08:33:50.606314 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xfjmf"] Oct 03 08:33:50 crc kubenswrapper[4664]: I1003 08:33:50.876201 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:33:51 crc kubenswrapper[4664]: I1003 08:33:51.385470 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"d5b3bc544feebad658faa11a236c9f43ce397e933eeb21cbdb9ec6cf5b19a6b0"} Oct 03 08:33:51 crc kubenswrapper[4664]: I1003 08:33:51.919903 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd4bbe8d-0cba-4580-89b9-442811220f69" path="/var/lib/kubelet/pods/dd4bbe8d-0cba-4580-89b9-442811220f69/volumes" Oct 03 08:36:11 crc kubenswrapper[4664]: I1003 08:36:11.988332 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:36:11 crc kubenswrapper[4664]: I1003 08:36:11.989125 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:36:41 crc kubenswrapper[4664]: I1003 08:36:41.987843 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:36:41 crc kubenswrapper[4664]: I1003 08:36:41.988526 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:37:11 crc kubenswrapper[4664]: I1003 08:37:11.988223 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:37:11 crc kubenswrapper[4664]: I1003 08:37:11.989026 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:37:11 crc kubenswrapper[4664]: I1003 08:37:11.989099 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 08:37:11 crc kubenswrapper[4664]: I1003 08:37:11.990276 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5b3bc544feebad658faa11a236c9f43ce397e933eeb21cbdb9ec6cf5b19a6b0"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:37:11 crc kubenswrapper[4664]: I1003 08:37:11.990356 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://d5b3bc544feebad658faa11a236c9f43ce397e933eeb21cbdb9ec6cf5b19a6b0" gracePeriod=600 Oct 03 08:37:12 crc kubenswrapper[4664]: I1003 08:37:12.264337 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="d5b3bc544feebad658faa11a236c9f43ce397e933eeb21cbdb9ec6cf5b19a6b0" exitCode=0 Oct 03 08:37:12 crc kubenswrapper[4664]: I1003 08:37:12.264429 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"d5b3bc544feebad658faa11a236c9f43ce397e933eeb21cbdb9ec6cf5b19a6b0"} Oct 03 08:37:12 crc kubenswrapper[4664]: I1003 08:37:12.264983 4664 scope.go:117] "RemoveContainer" containerID="87bb80884c4b381ef0d8fba25dac22cac6a15cc54ca77b684da87cc8b486fdd9" Oct 03 08:37:13 crc kubenswrapper[4664]: I1003 08:37:13.281499 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6"} Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.050389 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p"] Oct 03 08:37:19 crc kubenswrapper[4664]: E1003 08:37:19.051868 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4bbe8d-0cba-4580-89b9-442811220f69" containerName="registry-server" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.051889 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4bbe8d-0cba-4580-89b9-442811220f69" containerName="registry-server" Oct 03 08:37:19 crc kubenswrapper[4664]: E1003 08:37:19.051906 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4bbe8d-0cba-4580-89b9-442811220f69" containerName="extract-content" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.051913 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4bbe8d-0cba-4580-89b9-442811220f69" containerName="extract-content" Oct 03 08:37:19 crc kubenswrapper[4664]: E1003 08:37:19.051929 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4bbe8d-0cba-4580-89b9-442811220f69" containerName="extract-utilities" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.051940 4664 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="dd4bbe8d-0cba-4580-89b9-442811220f69" containerName="extract-utilities" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.052391 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd4bbe8d-0cba-4580-89b9-442811220f69" containerName="registry-server" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.053290 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.057039 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.057918 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.058577 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.058705 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h6s74" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.060103 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.064840 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p"] Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.225055 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wbt4p\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.225133 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e714e59e-a513-4061-852d-0e7a0f36e923-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wbt4p\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.225172 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-822gn\" (UniqueName: \"kubernetes.io/projected/e714e59e-a513-4061-852d-0e7a0f36e923-kube-api-access-822gn\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wbt4p\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.225216 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wbt4p\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.225297 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wbt4p\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.326652 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e714e59e-a513-4061-852d-0e7a0f36e923-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wbt4p\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.326717 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-822gn\" (UniqueName: \"kubernetes.io/projected/e714e59e-a513-4061-852d-0e7a0f36e923-kube-api-access-822gn\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wbt4p\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.326773 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wbt4p\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.326849 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wbt4p\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.327006 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wbt4p\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.328004 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e714e59e-a513-4061-852d-0e7a0f36e923-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wbt4p\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.334048 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wbt4p\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.334525 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wbt4p\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.336883 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wbt4p\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.344172 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-822gn\" (UniqueName: \"kubernetes.io/projected/e714e59e-a513-4061-852d-0e7a0f36e923-kube-api-access-822gn\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wbt4p\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.380731 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:19 crc kubenswrapper[4664]: I1003 08:37:19.900481 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p"] Oct 03 08:37:20 crc kubenswrapper[4664]: I1003 08:37:20.046030 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 08:37:20 crc kubenswrapper[4664]: I1003 08:37:20.361274 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" event={"ID":"e714e59e-a513-4061-852d-0e7a0f36e923","Type":"ContainerStarted","Data":"7304b7f2caf9dd48c987c1c3d1aad7f373407399117140fd2e3dcf9d70c7bbb8"} Oct 03 08:37:20 crc kubenswrapper[4664]: I1003 08:37:20.362847 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" event={"ID":"e714e59e-a513-4061-852d-0e7a0f36e923","Type":"ContainerStarted","Data":"c5b10c41d71d4c7c90189620fbec076397c89b67fed0b591baf1515ad11aeb5a"} Oct 03 08:37:20 crc kubenswrapper[4664]: I1003 08:37:20.380555 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" podStartSLOduration=1.22500751 podStartE2EDuration="1.380530332s" podCreationTimestamp="2025-10-03 08:37:19 +0000 UTC" firstStartedPulling="2025-10-03 08:37:19.888074434 +0000 UTC m=+2940.709264924" lastFinishedPulling="2025-10-03 08:37:20.043597256 +0000 UTC m=+2940.864787746" observedRunningTime="2025-10-03 08:37:20.379832522 +0000 UTC m=+2941.201023062" watchObservedRunningTime="2025-10-03 08:37:20.380530332 +0000 UTC m=+2941.201720842" Oct 03 08:37:48 crc kubenswrapper[4664]: I1003 08:37:48.717308 4664 generic.go:334] "Generic (PLEG): container finished" podID="e714e59e-a513-4061-852d-0e7a0f36e923" containerID="7304b7f2caf9dd48c987c1c3d1aad7f373407399117140fd2e3dcf9d70c7bbb8" exitCode=2 Oct 03 08:37:48 crc kubenswrapper[4664]: I1003 08:37:48.717379 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" event={"ID":"e714e59e-a513-4061-852d-0e7a0f36e923","Type":"ContainerDied","Data":"7304b7f2caf9dd48c987c1c3d1aad7f373407399117140fd2e3dcf9d70c7bbb8"} Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.141748 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.194650 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-ssh-key\") pod \"e714e59e-a513-4061-852d-0e7a0f36e923\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.194766 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-inventory\") pod \"e714e59e-a513-4061-852d-0e7a0f36e923\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.194912 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e714e59e-a513-4061-852d-0e7a0f36e923-ovncontroller-config-0\") pod \"e714e59e-a513-4061-852d-0e7a0f36e923\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.195064 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-ovn-combined-ca-bundle\") pod \"e714e59e-a513-4061-852d-0e7a0f36e923\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.195100 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-822gn\" (UniqueName: \"kubernetes.io/projected/e714e59e-a513-4061-852d-0e7a0f36e923-kube-api-access-822gn\") pod \"e714e59e-a513-4061-852d-0e7a0f36e923\" (UID: \"e714e59e-a513-4061-852d-0e7a0f36e923\") " Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.201899 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e714e59e-a513-4061-852d-0e7a0f36e923-kube-api-access-822gn" (OuterVolumeSpecName: "kube-api-access-822gn") pod "e714e59e-a513-4061-852d-0e7a0f36e923" (UID: "e714e59e-a513-4061-852d-0e7a0f36e923"). InnerVolumeSpecName "kube-api-access-822gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.202319 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e714e59e-a513-4061-852d-0e7a0f36e923" (UID: "e714e59e-a513-4061-852d-0e7a0f36e923"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.222527 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e714e59e-a513-4061-852d-0e7a0f36e923-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "e714e59e-a513-4061-852d-0e7a0f36e923" (UID: "e714e59e-a513-4061-852d-0e7a0f36e923"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.224535 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e714e59e-a513-4061-852d-0e7a0f36e923" (UID: "e714e59e-a513-4061-852d-0e7a0f36e923"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.227429 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-inventory" (OuterVolumeSpecName: "inventory") pod "e714e59e-a513-4061-852d-0e7a0f36e923" (UID: "e714e59e-a513-4061-852d-0e7a0f36e923"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.297367 4664 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e714e59e-a513-4061-852d-0e7a0f36e923-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.297409 4664 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.297422 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-822gn\" (UniqueName: \"kubernetes.io/projected/e714e59e-a513-4061-852d-0e7a0f36e923-kube-api-access-822gn\") on node \"crc\" DevicePath \"\"" Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.297434 4664 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.297444 4664 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e714e59e-a513-4061-852d-0e7a0f36e923-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.734889 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" event={"ID":"e714e59e-a513-4061-852d-0e7a0f36e923","Type":"ContainerDied","Data":"c5b10c41d71d4c7c90189620fbec076397c89b67fed0b591baf1515ad11aeb5a"} Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.734948 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5b10c41d71d4c7c90189620fbec076397c89b67fed0b591baf1515ad11aeb5a" Oct 03 08:37:50 crc kubenswrapper[4664]: I1003 08:37:50.735019 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wbt4p" Oct 03 08:39:41 crc kubenswrapper[4664]: I1003 08:39:41.987287 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:39:41 crc kubenswrapper[4664]: I1003 08:39:41.988097 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:40:11 crc kubenswrapper[4664]: I1003 08:40:11.988028 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:40:11 crc kubenswrapper[4664]: I1003 08:40:11.988922 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:40:41 crc kubenswrapper[4664]: I1003 08:40:41.988560 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:40:41 crc kubenswrapper[4664]: I1003 08:40:41.990300 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:40:41 crc kubenswrapper[4664]: I1003 08:40:41.990403 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 08:40:41 crc kubenswrapper[4664]: I1003 08:40:41.991684 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:40:41 crc kubenswrapper[4664]: I1003 08:40:41.991763 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" gracePeriod=600 Oct 03 08:40:42 crc kubenswrapper[4664]: E1003 08:40:42.118301 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:40:42 crc kubenswrapper[4664]: I1003 08:40:42.638454 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" exitCode=0 Oct 03 08:40:42 crc kubenswrapper[4664]: I1003 08:40:42.638535 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6"} Oct 03 08:40:42 crc kubenswrapper[4664]: I1003 08:40:42.638865 4664 scope.go:117] "RemoveContainer" containerID="d5b3bc544feebad658faa11a236c9f43ce397e933eeb21cbdb9ec6cf5b19a6b0" Oct 03 08:40:42 crc kubenswrapper[4664]: I1003 08:40:42.639986 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:40:42 crc kubenswrapper[4664]: E1003 08:40:42.640323 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:40:53 crc kubenswrapper[4664]: I1003 08:40:53.877102 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:40:53 crc kubenswrapper[4664]: E1003 08:40:53.878204 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:41:08 crc kubenswrapper[4664]: I1003 08:41:08.876952 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:41:08 crc kubenswrapper[4664]: E1003 08:41:08.877960 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:41:21 crc kubenswrapper[4664]: I1003 08:41:21.877044 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:41:21 crc kubenswrapper[4664]: E1003 08:41:21.878272 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 08:41:32 crc kubenswrapper[4664]: I1003 08:41:32.877007 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6"
Oct 03 08:41:32 crc kubenswrapper[4664]: E1003 08:41:32.878050 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 08:41:47 crc kubenswrapper[4664]: I1003 08:41:47.876757 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6"
Oct 03 08:41:47 crc kubenswrapper[4664]: E1003 08:41:47.879153 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 08:41:58 crc kubenswrapper[4664]: I1003 08:41:58.875844 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6"
Oct 03 08:41:58 crc kubenswrapper[4664]: E1003 08:41:58.877115 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 08:42:11 crc kubenswrapper[4664]: I1003 08:42:11.877069 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6"
Oct 03 08:42:11 crc kubenswrapper[4664]: E1003 08:42:11.878046 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 08:42:26 crc kubenswrapper[4664]: I1003 08:42:26.877336 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6"
Oct 03 08:42:26 crc kubenswrapper[4664]: E1003 08:42:26.878539 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
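
Every RemoveContainer attempt above is rejected with the same "back-off 5m0s" message: the kubelet sync loop keeps retrying the pod every ten-odd seconds, but the container's restart back-off has already grown to its cap, so each attempt is skipped until the five-minute window expires. A Go sketch of the doubling-with-cap pattern behind the message; the 10s initial delay and factor of 2 are the documented kubelet defaults, assumed here rather than read from this log:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const maxDelay = 5 * time.Minute // the "back-off 5m0s" cap in the log
        delay := 10 * time.Second        // assumed initial back-off
        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("restart %d delayed by %v\n", attempt, delay)
            delay *= 2 // doubles: 10s, 20s, 40s, 1m20s, 2m40s, then capped
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }
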
podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:42:39 crc kubenswrapper[4664]: I1003 08:42:39.875962 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:42:39 crc kubenswrapper[4664]: E1003 08:42:39.877076 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:42:46 crc kubenswrapper[4664]: I1003 08:42:46.583864 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wn667"] Oct 03 08:42:46 crc kubenswrapper[4664]: E1003 08:42:46.585299 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e714e59e-a513-4061-852d-0e7a0f36e923" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 08:42:46 crc kubenswrapper[4664]: I1003 08:42:46.585321 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="e714e59e-a513-4061-852d-0e7a0f36e923" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 08:42:46 crc kubenswrapper[4664]: I1003 08:42:46.585688 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="e714e59e-a513-4061-852d-0e7a0f36e923" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 08:42:46 crc kubenswrapper[4664]: I1003 08:42:46.587795 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:46 crc kubenswrapper[4664]: I1003 08:42:46.602411 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wn667"] Oct 03 08:42:46 crc kubenswrapper[4664]: I1003 08:42:46.676072 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwmp6\" (UniqueName: \"kubernetes.io/projected/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-kube-api-access-kwmp6\") pod \"certified-operators-wn667\" (UID: \"6d3b1030-66cf-4b71-a5b4-c38c1f02660b\") " pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:46 crc kubenswrapper[4664]: I1003 08:42:46.676256 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-catalog-content\") pod \"certified-operators-wn667\" (UID: \"6d3b1030-66cf-4b71-a5b4-c38c1f02660b\") " pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:46 crc kubenswrapper[4664]: I1003 08:42:46.676412 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-utilities\") pod \"certified-operators-wn667\" (UID: \"6d3b1030-66cf-4b71-a5b4-c38c1f02660b\") " pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:46 crc kubenswrapper[4664]: I1003 08:42:46.778206 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwmp6\" (UniqueName: \"kubernetes.io/projected/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-kube-api-access-kwmp6\") pod \"certified-operators-wn667\" (UID: \"6d3b1030-66cf-4b71-a5b4-c38c1f02660b\") " 
pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:46 crc kubenswrapper[4664]: I1003 08:42:46.778312 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-catalog-content\") pod \"certified-operators-wn667\" (UID: \"6d3b1030-66cf-4b71-a5b4-c38c1f02660b\") " pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:46 crc kubenswrapper[4664]: I1003 08:42:46.778380 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-utilities\") pod \"certified-operators-wn667\" (UID: \"6d3b1030-66cf-4b71-a5b4-c38c1f02660b\") " pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:46 crc kubenswrapper[4664]: I1003 08:42:46.779091 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-utilities\") pod \"certified-operators-wn667\" (UID: \"6d3b1030-66cf-4b71-a5b4-c38c1f02660b\") " pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:46 crc kubenswrapper[4664]: I1003 08:42:46.779423 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-catalog-content\") pod \"certified-operators-wn667\" (UID: \"6d3b1030-66cf-4b71-a5b4-c38c1f02660b\") " pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:46 crc kubenswrapper[4664]: I1003 08:42:46.803775 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwmp6\" (UniqueName: \"kubernetes.io/projected/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-kube-api-access-kwmp6\") pod \"certified-operators-wn667\" (UID: \"6d3b1030-66cf-4b71-a5b4-c38c1f02660b\") " pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:46 crc kubenswrapper[4664]: I1003 08:42:46.915680 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:47 crc kubenswrapper[4664]: I1003 08:42:47.492010 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wn667"] Oct 03 08:42:48 crc kubenswrapper[4664]: I1003 08:42:48.020210 4664 generic.go:334] "Generic (PLEG): container finished" podID="6d3b1030-66cf-4b71-a5b4-c38c1f02660b" containerID="6bbcb3fe8a725e2e7d062e2a93939938987e8750e5df74f95cabb3d3050338ad" exitCode=0 Oct 03 08:42:48 crc kubenswrapper[4664]: I1003 08:42:48.020574 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn667" event={"ID":"6d3b1030-66cf-4b71-a5b4-c38c1f02660b","Type":"ContainerDied","Data":"6bbcb3fe8a725e2e7d062e2a93939938987e8750e5df74f95cabb3d3050338ad"} Oct 03 08:42:48 crc kubenswrapper[4664]: I1003 08:42:48.020697 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn667" event={"ID":"6d3b1030-66cf-4b71-a5b4-c38c1f02660b","Type":"ContainerStarted","Data":"9cc6f1200eb5a0d02dc0ccb42049798cff9cea2ff5396fc32cd3dcb1a0cbcbcb"} Oct 03 08:42:48 crc kubenswrapper[4664]: I1003 08:42:48.025889 4664 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:42:49 crc kubenswrapper[4664]: I1003 08:42:49.034977 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn667" event={"ID":"6d3b1030-66cf-4b71-a5b4-c38c1f02660b","Type":"ContainerStarted","Data":"2920fa0fa91647c710317a10f6f0fb2574d2ba536efaa2481cf7518d0b575470"} Oct 03 08:42:50 crc kubenswrapper[4664]: I1003 08:42:50.051582 4664 generic.go:334] "Generic (PLEG): container finished" podID="6d3b1030-66cf-4b71-a5b4-c38c1f02660b" containerID="2920fa0fa91647c710317a10f6f0fb2574d2ba536efaa2481cf7518d0b575470" exitCode=0 Oct 03 08:42:50 crc kubenswrapper[4664]: I1003 08:42:50.051715 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn667" event={"ID":"6d3b1030-66cf-4b71-a5b4-c38c1f02660b","Type":"ContainerDied","Data":"2920fa0fa91647c710317a10f6f0fb2574d2ba536efaa2481cf7518d0b575470"} Oct 03 08:42:51 crc kubenswrapper[4664]: I1003 08:42:51.073372 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn667" event={"ID":"6d3b1030-66cf-4b71-a5b4-c38c1f02660b","Type":"ContainerStarted","Data":"819fbba908dc5fcd237c8bccc283b69e01fb084bb1d552828af649d94d4ee545"} Oct 03 08:42:51 crc kubenswrapper[4664]: I1003 08:42:51.100744 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wn667" podStartSLOduration=2.6573891 podStartE2EDuration="5.100722086s" podCreationTimestamp="2025-10-03 08:42:46 +0000 UTC" firstStartedPulling="2025-10-03 08:42:48.025251894 +0000 UTC m=+3268.846442384" lastFinishedPulling="2025-10-03 08:42:50.46858488 +0000 UTC m=+3271.289775370" observedRunningTime="2025-10-03 08:42:51.094003034 +0000 UTC m=+3271.915193534" watchObservedRunningTime="2025-10-03 08:42:51.100722086 +0000 UTC m=+3271.921912576" Oct 03 08:42:53 crc kubenswrapper[4664]: I1003 08:42:53.876547 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:42:53 crc kubenswrapper[4664]: E1003 08:42:53.878016 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:42:56 crc kubenswrapper[4664]: I1003 08:42:56.916525 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:56 crc kubenswrapper[4664]: I1003 08:42:56.917059 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:56 crc kubenswrapper[4664]: I1003 08:42:56.971204 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:57 crc kubenswrapper[4664]: I1003 08:42:57.195661 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:57 crc kubenswrapper[4664]: I1003 08:42:57.250861 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wn667"] Oct 03 08:42:59 crc kubenswrapper[4664]: I1003 08:42:59.162382 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wn667" podUID="6d3b1030-66cf-4b71-a5b4-c38c1f02660b" containerName="registry-server" containerID="cri-o://819fbba908dc5fcd237c8bccc283b69e01fb084bb1d552828af649d94d4ee545" gracePeriod=2 Oct 03 08:42:59 crc kubenswrapper[4664]: I1003 08:42:59.701874 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:42:59 crc kubenswrapper[4664]: I1003 08:42:59.787333 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-utilities\") pod \"6d3b1030-66cf-4b71-a5b4-c38c1f02660b\" (UID: \"6d3b1030-66cf-4b71-a5b4-c38c1f02660b\") " Oct 03 08:42:59 crc kubenswrapper[4664]: I1003 08:42:59.787384 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-catalog-content\") pod \"6d3b1030-66cf-4b71-a5b4-c38c1f02660b\" (UID: \"6d3b1030-66cf-4b71-a5b4-c38c1f02660b\") " Oct 03 08:42:59 crc kubenswrapper[4664]: I1003 08:42:59.787661 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwmp6\" (UniqueName: \"kubernetes.io/projected/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-kube-api-access-kwmp6\") pod \"6d3b1030-66cf-4b71-a5b4-c38c1f02660b\" (UID: \"6d3b1030-66cf-4b71-a5b4-c38c1f02660b\") " Oct 03 08:42:59 crc kubenswrapper[4664]: I1003 08:42:59.788802 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-utilities" (OuterVolumeSpecName: "utilities") pod "6d3b1030-66cf-4b71-a5b4-c38c1f02660b" (UID: "6d3b1030-66cf-4b71-a5b4-c38c1f02660b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:42:59 crc kubenswrapper[4664]: I1003 08:42:59.789631 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:42:59 crc kubenswrapper[4664]: I1003 08:42:59.800960 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-kube-api-access-kwmp6" (OuterVolumeSpecName: "kube-api-access-kwmp6") pod "6d3b1030-66cf-4b71-a5b4-c38c1f02660b" (UID: "6d3b1030-66cf-4b71-a5b4-c38c1f02660b"). InnerVolumeSpecName "kube-api-access-kwmp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:42:59 crc kubenswrapper[4664]: I1003 08:42:59.891930 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwmp6\" (UniqueName: \"kubernetes.io/projected/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-kube-api-access-kwmp6\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.174200 4664 generic.go:334] "Generic (PLEG): container finished" podID="6d3b1030-66cf-4b71-a5b4-c38c1f02660b" containerID="819fbba908dc5fcd237c8bccc283b69e01fb084bb1d552828af649d94d4ee545" exitCode=0 Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.174261 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn667" event={"ID":"6d3b1030-66cf-4b71-a5b4-c38c1f02660b","Type":"ContainerDied","Data":"819fbba908dc5fcd237c8bccc283b69e01fb084bb1d552828af649d94d4ee545"} Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.174308 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wn667" Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.174640 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn667" event={"ID":"6d3b1030-66cf-4b71-a5b4-c38c1f02660b","Type":"ContainerDied","Data":"9cc6f1200eb5a0d02dc0ccb42049798cff9cea2ff5396fc32cd3dcb1a0cbcbcb"} Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.174677 4664 scope.go:117] "RemoveContainer" containerID="819fbba908dc5fcd237c8bccc283b69e01fb084bb1d552828af649d94d4ee545" Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.199735 4664 scope.go:117] "RemoveContainer" containerID="2920fa0fa91647c710317a10f6f0fb2574d2ba536efaa2481cf7518d0b575470" Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.236975 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d3b1030-66cf-4b71-a5b4-c38c1f02660b" (UID: "6d3b1030-66cf-4b71-a5b4-c38c1f02660b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.237823 4664 scope.go:117] "RemoveContainer" containerID="6bbcb3fe8a725e2e7d062e2a93939938987e8750e5df74f95cabb3d3050338ad" Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.284270 4664 scope.go:117] "RemoveContainer" containerID="819fbba908dc5fcd237c8bccc283b69e01fb084bb1d552828af649d94d4ee545" Oct 03 08:43:00 crc kubenswrapper[4664]: E1003 08:43:00.284949 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"819fbba908dc5fcd237c8bccc283b69e01fb084bb1d552828af649d94d4ee545\": container with ID starting with 819fbba908dc5fcd237c8bccc283b69e01fb084bb1d552828af649d94d4ee545 not found: ID does not exist" containerID="819fbba908dc5fcd237c8bccc283b69e01fb084bb1d552828af649d94d4ee545" Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.284990 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"819fbba908dc5fcd237c8bccc283b69e01fb084bb1d552828af649d94d4ee545"} err="failed to get container status \"819fbba908dc5fcd237c8bccc283b69e01fb084bb1d552828af649d94d4ee545\": rpc error: code = NotFound desc = could not find container \"819fbba908dc5fcd237c8bccc283b69e01fb084bb1d552828af649d94d4ee545\": container with ID starting with 819fbba908dc5fcd237c8bccc283b69e01fb084bb1d552828af649d94d4ee545 not found: ID does not exist" Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.285022 4664 scope.go:117] "RemoveContainer" containerID="2920fa0fa91647c710317a10f6f0fb2574d2ba536efaa2481cf7518d0b575470" Oct 03 08:43:00 crc kubenswrapper[4664]: E1003 08:43:00.285597 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2920fa0fa91647c710317a10f6f0fb2574d2ba536efaa2481cf7518d0b575470\": container with ID starting with 2920fa0fa91647c710317a10f6f0fb2574d2ba536efaa2481cf7518d0b575470 not found: ID does not exist" containerID="2920fa0fa91647c710317a10f6f0fb2574d2ba536efaa2481cf7518d0b575470" Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.285824 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2920fa0fa91647c710317a10f6f0fb2574d2ba536efaa2481cf7518d0b575470"} err="failed to get container status \"2920fa0fa91647c710317a10f6f0fb2574d2ba536efaa2481cf7518d0b575470\": rpc error: code = NotFound desc = could not find container \"2920fa0fa91647c710317a10f6f0fb2574d2ba536efaa2481cf7518d0b575470\": container with ID starting with 2920fa0fa91647c710317a10f6f0fb2574d2ba536efaa2481cf7518d0b575470 not found: ID does not exist" Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.285979 4664 scope.go:117] "RemoveContainer" containerID="6bbcb3fe8a725e2e7d062e2a93939938987e8750e5df74f95cabb3d3050338ad" Oct 03 08:43:00 crc kubenswrapper[4664]: E1003 08:43:00.286678 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bbcb3fe8a725e2e7d062e2a93939938987e8750e5df74f95cabb3d3050338ad\": container with ID starting with 6bbcb3fe8a725e2e7d062e2a93939938987e8750e5df74f95cabb3d3050338ad not found: ID does not exist" containerID="6bbcb3fe8a725e2e7d062e2a93939938987e8750e5df74f95cabb3d3050338ad" Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.286705 4664 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6bbcb3fe8a725e2e7d062e2a93939938987e8750e5df74f95cabb3d3050338ad"} err="failed to get container status \"6bbcb3fe8a725e2e7d062e2a93939938987e8750e5df74f95cabb3d3050338ad\": rpc error: code = NotFound desc = could not find container \"6bbcb3fe8a725e2e7d062e2a93939938987e8750e5df74f95cabb3d3050338ad\": container with ID starting with 6bbcb3fe8a725e2e7d062e2a93939938987e8750e5df74f95cabb3d3050338ad not found: ID does not exist" Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.301276 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3b1030-66cf-4b71-a5b4-c38c1f02660b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.513265 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wn667"] Oct 03 08:43:00 crc kubenswrapper[4664]: I1003 08:43:00.523420 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wn667"] Oct 03 08:43:01 crc kubenswrapper[4664]: I1003 08:43:01.894703 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d3b1030-66cf-4b71-a5b4-c38c1f02660b" path="/var/lib/kubelet/pods/6d3b1030-66cf-4b71-a5b4-c38c1f02660b/volumes" Oct 03 08:43:04 crc kubenswrapper[4664]: I1003 08:43:04.878112 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:43:04 crc kubenswrapper[4664]: E1003 08:43:04.879387 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:43:19 crc kubenswrapper[4664]: I1003 08:43:19.883753 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:43:19 crc kubenswrapper[4664]: E1003 08:43:19.884493 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:43:34 crc kubenswrapper[4664]: I1003 08:43:34.876654 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:43:34 crc kubenswrapper[4664]: E1003 08:43:34.877711 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:43:44 crc kubenswrapper[4664]: I1003 08:43:44.745821 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jp4qx"] Oct 03 08:43:44 crc kubenswrapper[4664]: 
E1003 08:43:44.747329 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3b1030-66cf-4b71-a5b4-c38c1f02660b" containerName="registry-server" Oct 03 08:43:44 crc kubenswrapper[4664]: I1003 08:43:44.747353 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3b1030-66cf-4b71-a5b4-c38c1f02660b" containerName="registry-server" Oct 03 08:43:44 crc kubenswrapper[4664]: E1003 08:43:44.747417 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3b1030-66cf-4b71-a5b4-c38c1f02660b" containerName="extract-content" Oct 03 08:43:44 crc kubenswrapper[4664]: I1003 08:43:44.747431 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3b1030-66cf-4b71-a5b4-c38c1f02660b" containerName="extract-content" Oct 03 08:43:44 crc kubenswrapper[4664]: E1003 08:43:44.747455 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3b1030-66cf-4b71-a5b4-c38c1f02660b" containerName="extract-utilities" Oct 03 08:43:44 crc kubenswrapper[4664]: I1003 08:43:44.747467 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3b1030-66cf-4b71-a5b4-c38c1f02660b" containerName="extract-utilities" Oct 03 08:43:44 crc kubenswrapper[4664]: I1003 08:43:44.747903 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d3b1030-66cf-4b71-a5b4-c38c1f02660b" containerName="registry-server" Oct 03 08:43:44 crc kubenswrapper[4664]: I1003 08:43:44.753439 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:44 crc kubenswrapper[4664]: I1003 08:43:44.760792 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp4qx"] Oct 03 08:43:44 crc kubenswrapper[4664]: I1003 08:43:44.947164 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbkfv\" (UniqueName: \"kubernetes.io/projected/6459c504-5d74-4b7d-b62a-25c9a95754f8-kube-api-access-zbkfv\") pod \"redhat-marketplace-jp4qx\" (UID: \"6459c504-5d74-4b7d-b62a-25c9a95754f8\") " pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:44 crc kubenswrapper[4664]: I1003 08:43:44.947294 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6459c504-5d74-4b7d-b62a-25c9a95754f8-utilities\") pod \"redhat-marketplace-jp4qx\" (UID: \"6459c504-5d74-4b7d-b62a-25c9a95754f8\") " pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:44 crc kubenswrapper[4664]: I1003 08:43:44.947322 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6459c504-5d74-4b7d-b62a-25c9a95754f8-catalog-content\") pod \"redhat-marketplace-jp4qx\" (UID: \"6459c504-5d74-4b7d-b62a-25c9a95754f8\") " pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.049566 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbkfv\" (UniqueName: \"kubernetes.io/projected/6459c504-5d74-4b7d-b62a-25c9a95754f8-kube-api-access-zbkfv\") pod \"redhat-marketplace-jp4qx\" (UID: \"6459c504-5d74-4b7d-b62a-25c9a95754f8\") " pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.049897 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6459c504-5d74-4b7d-b62a-25c9a95754f8-utilities\") pod \"redhat-marketplace-jp4qx\" (UID: \"6459c504-5d74-4b7d-b62a-25c9a95754f8\") " pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.049941 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6459c504-5d74-4b7d-b62a-25c9a95754f8-catalog-content\") pod \"redhat-marketplace-jp4qx\" (UID: \"6459c504-5d74-4b7d-b62a-25c9a95754f8\") " pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.050574 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6459c504-5d74-4b7d-b62a-25c9a95754f8-catalog-content\") pod \"redhat-marketplace-jp4qx\" (UID: \"6459c504-5d74-4b7d-b62a-25c9a95754f8\") " pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.050591 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6459c504-5d74-4b7d-b62a-25c9a95754f8-utilities\") pod \"redhat-marketplace-jp4qx\" (UID: \"6459c504-5d74-4b7d-b62a-25c9a95754f8\") " pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.078498 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbkfv\" (UniqueName: \"kubernetes.io/projected/6459c504-5d74-4b7d-b62a-25c9a95754f8-kube-api-access-zbkfv\") pod \"redhat-marketplace-jp4qx\" (UID: \"6459c504-5d74-4b7d-b62a-25c9a95754f8\") " pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.104677 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.341236 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d8qfr"] Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.343892 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.355710 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d8qfr"] Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.465558 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnz8b\" (UniqueName: \"kubernetes.io/projected/81bc144e-0356-46fa-ab7e-5aeaa662e875-kube-api-access-bnz8b\") pod \"redhat-operators-d8qfr\" (UID: \"81bc144e-0356-46fa-ab7e-5aeaa662e875\") " pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.465653 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81bc144e-0356-46fa-ab7e-5aeaa662e875-catalog-content\") pod \"redhat-operators-d8qfr\" (UID: \"81bc144e-0356-46fa-ab7e-5aeaa662e875\") " pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.465700 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81bc144e-0356-46fa-ab7e-5aeaa662e875-utilities\") pod \"redhat-operators-d8qfr\" (UID: \"81bc144e-0356-46fa-ab7e-5aeaa662e875\") " pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.567497 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnz8b\" (UniqueName: \"kubernetes.io/projected/81bc144e-0356-46fa-ab7e-5aeaa662e875-kube-api-access-bnz8b\") pod \"redhat-operators-d8qfr\" (UID: \"81bc144e-0356-46fa-ab7e-5aeaa662e875\") " pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.567582 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81bc144e-0356-46fa-ab7e-5aeaa662e875-catalog-content\") pod \"redhat-operators-d8qfr\" (UID: \"81bc144e-0356-46fa-ab7e-5aeaa662e875\") " pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.567747 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81bc144e-0356-46fa-ab7e-5aeaa662e875-utilities\") pod \"redhat-operators-d8qfr\" (UID: \"81bc144e-0356-46fa-ab7e-5aeaa662e875\") " pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.569507 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81bc144e-0356-46fa-ab7e-5aeaa662e875-catalog-content\") pod \"redhat-operators-d8qfr\" (UID: \"81bc144e-0356-46fa-ab7e-5aeaa662e875\") " pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.569786 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81bc144e-0356-46fa-ab7e-5aeaa662e875-utilities\") pod \"redhat-operators-d8qfr\" (UID: \"81bc144e-0356-46fa-ab7e-5aeaa662e875\") " pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.593407 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bnz8b\" (UniqueName: \"kubernetes.io/projected/81bc144e-0356-46fa-ab7e-5aeaa662e875-kube-api-access-bnz8b\") pod \"redhat-operators-d8qfr\" (UID: \"81bc144e-0356-46fa-ab7e-5aeaa662e875\") " pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.644954 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp4qx"] Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.672272 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:45 crc kubenswrapper[4664]: I1003 08:43:45.685398 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp4qx" event={"ID":"6459c504-5d74-4b7d-b62a-25c9a95754f8","Type":"ContainerStarted","Data":"7d39991af3b362338a084fa897131f7a57e7cca9e4d339a7eed486678787886c"} Oct 03 08:43:46 crc kubenswrapper[4664]: I1003 08:43:46.172174 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d8qfr"] Oct 03 08:43:46 crc kubenswrapper[4664]: I1003 08:43:46.699278 4664 generic.go:334] "Generic (PLEG): container finished" podID="81bc144e-0356-46fa-ab7e-5aeaa662e875" containerID="6aabfb349e738061833165adcc1e7607e8ddebcfabdb954faa6bcd0a12fa25bb" exitCode=0 Oct 03 08:43:46 crc kubenswrapper[4664]: I1003 08:43:46.699453 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8qfr" event={"ID":"81bc144e-0356-46fa-ab7e-5aeaa662e875","Type":"ContainerDied","Data":"6aabfb349e738061833165adcc1e7607e8ddebcfabdb954faa6bcd0a12fa25bb"} Oct 03 08:43:46 crc kubenswrapper[4664]: I1003 08:43:46.699970 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8qfr" event={"ID":"81bc144e-0356-46fa-ab7e-5aeaa662e875","Type":"ContainerStarted","Data":"394acdb203ad07e4705d1c9ff85d18ea397a34998a8336822a47b1fcff9d7b0a"} Oct 03 08:43:46 crc kubenswrapper[4664]: I1003 08:43:46.702683 4664 generic.go:334] "Generic (PLEG): container finished" podID="6459c504-5d74-4b7d-b62a-25c9a95754f8" containerID="dbbb67e7f7ede25e3e9932e1491b912e31a675c0f0ff5d0412af3386be9f3536" exitCode=0 Oct 03 08:43:46 crc kubenswrapper[4664]: I1003 08:43:46.702720 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp4qx" event={"ID":"6459c504-5d74-4b7d-b62a-25c9a95754f8","Type":"ContainerDied","Data":"dbbb67e7f7ede25e3e9932e1491b912e31a675c0f0ff5d0412af3386be9f3536"} Oct 03 08:43:47 crc kubenswrapper[4664]: I1003 08:43:47.741764 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nrlkk"] Oct 03 08:43:47 crc kubenswrapper[4664]: I1003 08:43:47.748084 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:43:47 crc kubenswrapper[4664]: I1003 08:43:47.811857 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nrlkk"] Oct 03 08:43:47 crc kubenswrapper[4664]: I1003 08:43:47.921498 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-utilities\") pod \"community-operators-nrlkk\" (UID: \"e3790394-92d9-4667-9b1a-e3aa70c6a9f2\") " pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:43:47 crc kubenswrapper[4664]: I1003 08:43:47.921678 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-catalog-content\") pod \"community-operators-nrlkk\" (UID: \"e3790394-92d9-4667-9b1a-e3aa70c6a9f2\") " pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:43:47 crc kubenswrapper[4664]: I1003 08:43:47.921701 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvnr8\" (UniqueName: \"kubernetes.io/projected/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-kube-api-access-dvnr8\") pod \"community-operators-nrlkk\" (UID: \"e3790394-92d9-4667-9b1a-e3aa70c6a9f2\") " pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:43:48 crc kubenswrapper[4664]: I1003 08:43:48.024017 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-utilities\") pod \"community-operators-nrlkk\" (UID: \"e3790394-92d9-4667-9b1a-e3aa70c6a9f2\") " pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:43:48 crc kubenswrapper[4664]: I1003 08:43:48.024458 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-catalog-content\") pod \"community-operators-nrlkk\" (UID: \"e3790394-92d9-4667-9b1a-e3aa70c6a9f2\") " pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:43:48 crc kubenswrapper[4664]: I1003 08:43:48.024614 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvnr8\" (UniqueName: \"kubernetes.io/projected/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-kube-api-access-dvnr8\") pod \"community-operators-nrlkk\" (UID: \"e3790394-92d9-4667-9b1a-e3aa70c6a9f2\") " pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:43:48 crc kubenswrapper[4664]: I1003 08:43:48.025139 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-utilities\") pod \"community-operators-nrlkk\" (UID: \"e3790394-92d9-4667-9b1a-e3aa70c6a9f2\") " pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:43:48 crc kubenswrapper[4664]: I1003 08:43:48.025357 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-catalog-content\") pod \"community-operators-nrlkk\" (UID: \"e3790394-92d9-4667-9b1a-e3aa70c6a9f2\") " pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:43:48 crc kubenswrapper[4664]: I1003 08:43:48.046112 4664 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dvnr8\" (UniqueName: \"kubernetes.io/projected/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-kube-api-access-dvnr8\") pod \"community-operators-nrlkk\" (UID: \"e3790394-92d9-4667-9b1a-e3aa70c6a9f2\") " pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:43:48 crc kubenswrapper[4664]: I1003 08:43:48.084713 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:43:48 crc kubenswrapper[4664]: I1003 08:43:48.653857 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nrlkk"] Oct 03 08:43:48 crc kubenswrapper[4664]: I1003 08:43:48.727797 4664 generic.go:334] "Generic (PLEG): container finished" podID="81bc144e-0356-46fa-ab7e-5aeaa662e875" containerID="cc94ee7a4501b69b1ac5db56145263097e5ae9ab35610bce711665c32f328f04" exitCode=0 Oct 03 08:43:48 crc kubenswrapper[4664]: I1003 08:43:48.727918 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8qfr" event={"ID":"81bc144e-0356-46fa-ab7e-5aeaa662e875","Type":"ContainerDied","Data":"cc94ee7a4501b69b1ac5db56145263097e5ae9ab35610bce711665c32f328f04"} Oct 03 08:43:48 crc kubenswrapper[4664]: I1003 08:43:48.733131 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrlkk" event={"ID":"e3790394-92d9-4667-9b1a-e3aa70c6a9f2","Type":"ContainerStarted","Data":"17a395175061abb23067107b01bec9434d45628f1311e08fd598e6fcce588128"} Oct 03 08:43:48 crc kubenswrapper[4664]: I1003 08:43:48.736783 4664 generic.go:334] "Generic (PLEG): container finished" podID="6459c504-5d74-4b7d-b62a-25c9a95754f8" containerID="47e23dc4a24c9270409ac470cdbd35227c3a613fe87e7859aa634151374737a8" exitCode=0 Oct 03 08:43:48 crc kubenswrapper[4664]: I1003 08:43:48.736823 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp4qx" event={"ID":"6459c504-5d74-4b7d-b62a-25c9a95754f8","Type":"ContainerDied","Data":"47e23dc4a24c9270409ac470cdbd35227c3a613fe87e7859aa634151374737a8"} Oct 03 08:43:49 crc kubenswrapper[4664]: I1003 08:43:49.752822 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8qfr" event={"ID":"81bc144e-0356-46fa-ab7e-5aeaa662e875","Type":"ContainerStarted","Data":"92cc3e140c13eb5eeb5a7c6c69d29ed862d83d00fe6a88796dc0af49c42255c4"} Oct 03 08:43:49 crc kubenswrapper[4664]: I1003 08:43:49.755739 4664 generic.go:334] "Generic (PLEG): container finished" podID="e3790394-92d9-4667-9b1a-e3aa70c6a9f2" containerID="03f2ca66c9da56cc31e80257907edcf0de193833d6f6bfbae222f120974f01c9" exitCode=0 Oct 03 08:43:49 crc kubenswrapper[4664]: I1003 08:43:49.755833 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrlkk" event={"ID":"e3790394-92d9-4667-9b1a-e3aa70c6a9f2","Type":"ContainerDied","Data":"03f2ca66c9da56cc31e80257907edcf0de193833d6f6bfbae222f120974f01c9"} Oct 03 08:43:49 crc kubenswrapper[4664]: I1003 08:43:49.761663 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp4qx" event={"ID":"6459c504-5d74-4b7d-b62a-25c9a95754f8","Type":"ContainerStarted","Data":"32945f4146120da3d111a5f31f8a9cada5640ee02565567194e870fa68440f91"} Oct 03 08:43:49 crc kubenswrapper[4664]: I1003 08:43:49.791053 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d8qfr" 
podStartSLOduration=2.325505001 podStartE2EDuration="4.791025319s" podCreationTimestamp="2025-10-03 08:43:45 +0000 UTC" firstStartedPulling="2025-10-03 08:43:46.702043801 +0000 UTC m=+3327.523234291" lastFinishedPulling="2025-10-03 08:43:49.167564119 +0000 UTC m=+3329.988754609" observedRunningTime="2025-10-03 08:43:49.781738324 +0000 UTC m=+3330.602928834" watchObservedRunningTime="2025-10-03 08:43:49.791025319 +0000 UTC m=+3330.612215809" Oct 03 08:43:49 crc kubenswrapper[4664]: I1003 08:43:49.826913 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jp4qx" podStartSLOduration=3.250440836 podStartE2EDuration="5.826888113s" podCreationTimestamp="2025-10-03 08:43:44 +0000 UTC" firstStartedPulling="2025-10-03 08:43:46.704285785 +0000 UTC m=+3327.525476275" lastFinishedPulling="2025-10-03 08:43:49.280733052 +0000 UTC m=+3330.101923552" observedRunningTime="2025-10-03 08:43:49.820351497 +0000 UTC m=+3330.641541987" watchObservedRunningTime="2025-10-03 08:43:49.826888113 +0000 UTC m=+3330.648078603" Oct 03 08:43:49 crc kubenswrapper[4664]: I1003 08:43:49.884669 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:43:49 crc kubenswrapper[4664]: E1003 08:43:49.884998 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:43:51 crc kubenswrapper[4664]: I1003 08:43:51.785057 4664 generic.go:334] "Generic (PLEG): container finished" podID="e3790394-92d9-4667-9b1a-e3aa70c6a9f2" containerID="820c7c7d24c5e86f2188b4d275cd194912c2883fda15890eb78ad26201bdfe7a" exitCode=0 Oct 03 08:43:51 crc kubenswrapper[4664]: I1003 08:43:51.785091 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrlkk" event={"ID":"e3790394-92d9-4667-9b1a-e3aa70c6a9f2","Type":"ContainerDied","Data":"820c7c7d24c5e86f2188b4d275cd194912c2883fda15890eb78ad26201bdfe7a"} Oct 03 08:43:52 crc kubenswrapper[4664]: I1003 08:43:52.797851 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrlkk" event={"ID":"e3790394-92d9-4667-9b1a-e3aa70c6a9f2","Type":"ContainerStarted","Data":"43252b8d086bfe623f4a91a17b16429c7ac8bd1c8c4411305128b2dd4288edea"} Oct 03 08:43:52 crc kubenswrapper[4664]: I1003 08:43:52.821241 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nrlkk" podStartSLOduration=3.402522737 podStartE2EDuration="5.821217608s" podCreationTimestamp="2025-10-03 08:43:47 +0000 UTC" firstStartedPulling="2025-10-03 08:43:49.757941704 +0000 UTC m=+3330.579132194" lastFinishedPulling="2025-10-03 08:43:52.176636575 +0000 UTC m=+3332.997827065" observedRunningTime="2025-10-03 08:43:52.815538185 +0000 UTC m=+3333.636728695" watchObservedRunningTime="2025-10-03 08:43:52.821217608 +0000 UTC m=+3333.642408088" Oct 03 08:43:55 crc kubenswrapper[4664]: I1003 08:43:55.105101 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:55 crc kubenswrapper[4664]: I1003 08:43:55.105355 4664 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:55 crc kubenswrapper[4664]: I1003 08:43:55.175855 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:55 crc kubenswrapper[4664]: I1003 08:43:55.673217 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:55 crc kubenswrapper[4664]: I1003 08:43:55.673286 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:55 crc kubenswrapper[4664]: I1003 08:43:55.755234 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:55 crc kubenswrapper[4664]: I1003 08:43:55.903510 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:55 crc kubenswrapper[4664]: I1003 08:43:55.903618 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:57 crc kubenswrapper[4664]: I1003 08:43:57.119088 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp4qx"] Oct 03 08:43:57 crc kubenswrapper[4664]: I1003 08:43:57.858318 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jp4qx" podUID="6459c504-5d74-4b7d-b62a-25c9a95754f8" containerName="registry-server" containerID="cri-o://32945f4146120da3d111a5f31f8a9cada5640ee02565567194e870fa68440f91" gracePeriod=2 Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.085990 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.086515 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.117169 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d8qfr"] Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.117453 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d8qfr" podUID="81bc144e-0356-46fa-ab7e-5aeaa662e875" containerName="registry-server" containerID="cri-o://92cc3e140c13eb5eeb5a7c6c69d29ed862d83d00fe6a88796dc0af49c42255c4" gracePeriod=2 Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.162302 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.346659 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.475375 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6459c504-5d74-4b7d-b62a-25c9a95754f8-utilities\") pod \"6459c504-5d74-4b7d-b62a-25c9a95754f8\" (UID: \"6459c504-5d74-4b7d-b62a-25c9a95754f8\") " Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.475571 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6459c504-5d74-4b7d-b62a-25c9a95754f8-catalog-content\") pod \"6459c504-5d74-4b7d-b62a-25c9a95754f8\" (UID: \"6459c504-5d74-4b7d-b62a-25c9a95754f8\") " Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.475764 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbkfv\" (UniqueName: \"kubernetes.io/projected/6459c504-5d74-4b7d-b62a-25c9a95754f8-kube-api-access-zbkfv\") pod \"6459c504-5d74-4b7d-b62a-25c9a95754f8\" (UID: \"6459c504-5d74-4b7d-b62a-25c9a95754f8\") " Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.476281 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6459c504-5d74-4b7d-b62a-25c9a95754f8-utilities" (OuterVolumeSpecName: "utilities") pod "6459c504-5d74-4b7d-b62a-25c9a95754f8" (UID: "6459c504-5d74-4b7d-b62a-25c9a95754f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.476745 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6459c504-5d74-4b7d-b62a-25c9a95754f8-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.485226 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6459c504-5d74-4b7d-b62a-25c9a95754f8-kube-api-access-zbkfv" (OuterVolumeSpecName: "kube-api-access-zbkfv") pod "6459c504-5d74-4b7d-b62a-25c9a95754f8" (UID: "6459c504-5d74-4b7d-b62a-25c9a95754f8"). InnerVolumeSpecName "kube-api-access-zbkfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.490832 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6459c504-5d74-4b7d-b62a-25c9a95754f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6459c504-5d74-4b7d-b62a-25c9a95754f8" (UID: "6459c504-5d74-4b7d-b62a-25c9a95754f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.569425 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.579661 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6459c504-5d74-4b7d-b62a-25c9a95754f8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.579717 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbkfv\" (UniqueName: \"kubernetes.io/projected/6459c504-5d74-4b7d-b62a-25c9a95754f8-kube-api-access-zbkfv\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.681530 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnz8b\" (UniqueName: \"kubernetes.io/projected/81bc144e-0356-46fa-ab7e-5aeaa662e875-kube-api-access-bnz8b\") pod \"81bc144e-0356-46fa-ab7e-5aeaa662e875\" (UID: \"81bc144e-0356-46fa-ab7e-5aeaa662e875\") " Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.682541 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81bc144e-0356-46fa-ab7e-5aeaa662e875-catalog-content\") pod \"81bc144e-0356-46fa-ab7e-5aeaa662e875\" (UID: \"81bc144e-0356-46fa-ab7e-5aeaa662e875\") " Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.682803 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81bc144e-0356-46fa-ab7e-5aeaa662e875-utilities\") pod \"81bc144e-0356-46fa-ab7e-5aeaa662e875\" (UID: \"81bc144e-0356-46fa-ab7e-5aeaa662e875\") " Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.683914 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81bc144e-0356-46fa-ab7e-5aeaa662e875-utilities" (OuterVolumeSpecName: "utilities") pod "81bc144e-0356-46fa-ab7e-5aeaa662e875" (UID: "81bc144e-0356-46fa-ab7e-5aeaa662e875"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.684467 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81bc144e-0356-46fa-ab7e-5aeaa662e875-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.688380 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81bc144e-0356-46fa-ab7e-5aeaa662e875-kube-api-access-bnz8b" (OuterVolumeSpecName: "kube-api-access-bnz8b") pod "81bc144e-0356-46fa-ab7e-5aeaa662e875" (UID: "81bc144e-0356-46fa-ab7e-5aeaa662e875"). InnerVolumeSpecName "kube-api-access-bnz8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.777560 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81bc144e-0356-46fa-ab7e-5aeaa662e875-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81bc144e-0356-46fa-ab7e-5aeaa662e875" (UID: "81bc144e-0356-46fa-ab7e-5aeaa662e875"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.786166 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81bc144e-0356-46fa-ab7e-5aeaa662e875-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.786212 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnz8b\" (UniqueName: \"kubernetes.io/projected/81bc144e-0356-46fa-ab7e-5aeaa662e875-kube-api-access-bnz8b\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.869921 4664 generic.go:334] "Generic (PLEG): container finished" podID="6459c504-5d74-4b7d-b62a-25c9a95754f8" containerID="32945f4146120da3d111a5f31f8a9cada5640ee02565567194e870fa68440f91" exitCode=0 Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.870009 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp4qx" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.870027 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp4qx" event={"ID":"6459c504-5d74-4b7d-b62a-25c9a95754f8","Type":"ContainerDied","Data":"32945f4146120da3d111a5f31f8a9cada5640ee02565567194e870fa68440f91"} Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.870133 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp4qx" event={"ID":"6459c504-5d74-4b7d-b62a-25c9a95754f8","Type":"ContainerDied","Data":"7d39991af3b362338a084fa897131f7a57e7cca9e4d339a7eed486678787886c"} Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.870175 4664 scope.go:117] "RemoveContainer" containerID="32945f4146120da3d111a5f31f8a9cada5640ee02565567194e870fa68440f91" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.874966 4664 generic.go:334] "Generic (PLEG): container finished" podID="81bc144e-0356-46fa-ab7e-5aeaa662e875" containerID="92cc3e140c13eb5eeb5a7c6c69d29ed862d83d00fe6a88796dc0af49c42255c4" exitCode=0 Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.875062 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8qfr" event={"ID":"81bc144e-0356-46fa-ab7e-5aeaa662e875","Type":"ContainerDied","Data":"92cc3e140c13eb5eeb5a7c6c69d29ed862d83d00fe6a88796dc0af49c42255c4"} Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.875091 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d8qfr" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.875211 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8qfr" event={"ID":"81bc144e-0356-46fa-ab7e-5aeaa662e875","Type":"ContainerDied","Data":"394acdb203ad07e4705d1c9ff85d18ea397a34998a8336822a47b1fcff9d7b0a"} Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.915095 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp4qx"] Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.921521 4664 scope.go:117] "RemoveContainer" containerID="47e23dc4a24c9270409ac470cdbd35227c3a613fe87e7859aa634151374737a8" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.924369 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp4qx"] Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.946086 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d8qfr"] Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.946962 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.957982 4664 scope.go:117] "RemoveContainer" containerID="dbbb67e7f7ede25e3e9932e1491b912e31a675c0f0ff5d0412af3386be9f3536" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.958298 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d8qfr"] Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.990152 4664 scope.go:117] "RemoveContainer" containerID="32945f4146120da3d111a5f31f8a9cada5640ee02565567194e870fa68440f91" Oct 03 08:43:58 crc kubenswrapper[4664]: E1003 08:43:58.990897 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32945f4146120da3d111a5f31f8a9cada5640ee02565567194e870fa68440f91\": container with ID starting with 32945f4146120da3d111a5f31f8a9cada5640ee02565567194e870fa68440f91 not found: ID does not exist" containerID="32945f4146120da3d111a5f31f8a9cada5640ee02565567194e870fa68440f91" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.990958 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32945f4146120da3d111a5f31f8a9cada5640ee02565567194e870fa68440f91"} err="failed to get container status \"32945f4146120da3d111a5f31f8a9cada5640ee02565567194e870fa68440f91\": rpc error: code = NotFound desc = could not find container \"32945f4146120da3d111a5f31f8a9cada5640ee02565567194e870fa68440f91\": container with ID starting with 32945f4146120da3d111a5f31f8a9cada5640ee02565567194e870fa68440f91 not found: ID does not exist" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.990996 4664 scope.go:117] "RemoveContainer" containerID="47e23dc4a24c9270409ac470cdbd35227c3a613fe87e7859aa634151374737a8" Oct 03 08:43:58 crc kubenswrapper[4664]: E1003 08:43:58.991289 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47e23dc4a24c9270409ac470cdbd35227c3a613fe87e7859aa634151374737a8\": container with ID starting with 47e23dc4a24c9270409ac470cdbd35227c3a613fe87e7859aa634151374737a8 not found: ID does not exist" containerID="47e23dc4a24c9270409ac470cdbd35227c3a613fe87e7859aa634151374737a8" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 
08:43:58.991317 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e23dc4a24c9270409ac470cdbd35227c3a613fe87e7859aa634151374737a8"} err="failed to get container status \"47e23dc4a24c9270409ac470cdbd35227c3a613fe87e7859aa634151374737a8\": rpc error: code = NotFound desc = could not find container \"47e23dc4a24c9270409ac470cdbd35227c3a613fe87e7859aa634151374737a8\": container with ID starting with 47e23dc4a24c9270409ac470cdbd35227c3a613fe87e7859aa634151374737a8 not found: ID does not exist" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.991332 4664 scope.go:117] "RemoveContainer" containerID="dbbb67e7f7ede25e3e9932e1491b912e31a675c0f0ff5d0412af3386be9f3536" Oct 03 08:43:58 crc kubenswrapper[4664]: E1003 08:43:58.991547 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbbb67e7f7ede25e3e9932e1491b912e31a675c0f0ff5d0412af3386be9f3536\": container with ID starting with dbbb67e7f7ede25e3e9932e1491b912e31a675c0f0ff5d0412af3386be9f3536 not found: ID does not exist" containerID="dbbb67e7f7ede25e3e9932e1491b912e31a675c0f0ff5d0412af3386be9f3536" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.991572 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbbb67e7f7ede25e3e9932e1491b912e31a675c0f0ff5d0412af3386be9f3536"} err="failed to get container status \"dbbb67e7f7ede25e3e9932e1491b912e31a675c0f0ff5d0412af3386be9f3536\": rpc error: code = NotFound desc = could not find container \"dbbb67e7f7ede25e3e9932e1491b912e31a675c0f0ff5d0412af3386be9f3536\": container with ID starting with dbbb67e7f7ede25e3e9932e1491b912e31a675c0f0ff5d0412af3386be9f3536 not found: ID does not exist" Oct 03 08:43:58 crc kubenswrapper[4664]: I1003 08:43:58.991666 4664 scope.go:117] "RemoveContainer" containerID="92cc3e140c13eb5eeb5a7c6c69d29ed862d83d00fe6a88796dc0af49c42255c4" Oct 03 08:43:59 crc kubenswrapper[4664]: I1003 08:43:59.057465 4664 scope.go:117] "RemoveContainer" containerID="cc94ee7a4501b69b1ac5db56145263097e5ae9ab35610bce711665c32f328f04" Oct 03 08:43:59 crc kubenswrapper[4664]: I1003 08:43:59.083654 4664 scope.go:117] "RemoveContainer" containerID="6aabfb349e738061833165adcc1e7607e8ddebcfabdb954faa6bcd0a12fa25bb" Oct 03 08:43:59 crc kubenswrapper[4664]: I1003 08:43:59.134489 4664 scope.go:117] "RemoveContainer" containerID="92cc3e140c13eb5eeb5a7c6c69d29ed862d83d00fe6a88796dc0af49c42255c4" Oct 03 08:43:59 crc kubenswrapper[4664]: E1003 08:43:59.135309 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92cc3e140c13eb5eeb5a7c6c69d29ed862d83d00fe6a88796dc0af49c42255c4\": container with ID starting with 92cc3e140c13eb5eeb5a7c6c69d29ed862d83d00fe6a88796dc0af49c42255c4 not found: ID does not exist" containerID="92cc3e140c13eb5eeb5a7c6c69d29ed862d83d00fe6a88796dc0af49c42255c4" Oct 03 08:43:59 crc kubenswrapper[4664]: I1003 08:43:59.135367 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92cc3e140c13eb5eeb5a7c6c69d29ed862d83d00fe6a88796dc0af49c42255c4"} err="failed to get container status \"92cc3e140c13eb5eeb5a7c6c69d29ed862d83d00fe6a88796dc0af49c42255c4\": rpc error: code = NotFound desc = could not find container \"92cc3e140c13eb5eeb5a7c6c69d29ed862d83d00fe6a88796dc0af49c42255c4\": container with ID starting with 92cc3e140c13eb5eeb5a7c6c69d29ed862d83d00fe6a88796dc0af49c42255c4 not found: ID does not 
exist" Oct 03 08:43:59 crc kubenswrapper[4664]: I1003 08:43:59.135400 4664 scope.go:117] "RemoveContainer" containerID="cc94ee7a4501b69b1ac5db56145263097e5ae9ab35610bce711665c32f328f04" Oct 03 08:43:59 crc kubenswrapper[4664]: E1003 08:43:59.136137 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc94ee7a4501b69b1ac5db56145263097e5ae9ab35610bce711665c32f328f04\": container with ID starting with cc94ee7a4501b69b1ac5db56145263097e5ae9ab35610bce711665c32f328f04 not found: ID does not exist" containerID="cc94ee7a4501b69b1ac5db56145263097e5ae9ab35610bce711665c32f328f04" Oct 03 08:43:59 crc kubenswrapper[4664]: I1003 08:43:59.136168 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc94ee7a4501b69b1ac5db56145263097e5ae9ab35610bce711665c32f328f04"} err="failed to get container status \"cc94ee7a4501b69b1ac5db56145263097e5ae9ab35610bce711665c32f328f04\": rpc error: code = NotFound desc = could not find container \"cc94ee7a4501b69b1ac5db56145263097e5ae9ab35610bce711665c32f328f04\": container with ID starting with cc94ee7a4501b69b1ac5db56145263097e5ae9ab35610bce711665c32f328f04 not found: ID does not exist" Oct 03 08:43:59 crc kubenswrapper[4664]: I1003 08:43:59.136187 4664 scope.go:117] "RemoveContainer" containerID="6aabfb349e738061833165adcc1e7607e8ddebcfabdb954faa6bcd0a12fa25bb" Oct 03 08:43:59 crc kubenswrapper[4664]: E1003 08:43:59.136553 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aabfb349e738061833165adcc1e7607e8ddebcfabdb954faa6bcd0a12fa25bb\": container with ID starting with 6aabfb349e738061833165adcc1e7607e8ddebcfabdb954faa6bcd0a12fa25bb not found: ID does not exist" containerID="6aabfb349e738061833165adcc1e7607e8ddebcfabdb954faa6bcd0a12fa25bb" Oct 03 08:43:59 crc kubenswrapper[4664]: I1003 08:43:59.136573 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aabfb349e738061833165adcc1e7607e8ddebcfabdb954faa6bcd0a12fa25bb"} err="failed to get container status \"6aabfb349e738061833165adcc1e7607e8ddebcfabdb954faa6bcd0a12fa25bb\": rpc error: code = NotFound desc = could not find container \"6aabfb349e738061833165adcc1e7607e8ddebcfabdb954faa6bcd0a12fa25bb\": container with ID starting with 6aabfb349e738061833165adcc1e7607e8ddebcfabdb954faa6bcd0a12fa25bb not found: ID does not exist" Oct 03 08:43:59 crc kubenswrapper[4664]: I1003 08:43:59.892827 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6459c504-5d74-4b7d-b62a-25c9a95754f8" path="/var/lib/kubelet/pods/6459c504-5d74-4b7d-b62a-25c9a95754f8/volumes" Oct 03 08:43:59 crc kubenswrapper[4664]: I1003 08:43:59.893904 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81bc144e-0356-46fa-ab7e-5aeaa662e875" path="/var/lib/kubelet/pods/81bc144e-0356-46fa-ab7e-5aeaa662e875/volumes" Oct 03 08:44:01 crc kubenswrapper[4664]: I1003 08:44:01.524449 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nrlkk"] Oct 03 08:44:01 crc kubenswrapper[4664]: I1003 08:44:01.524774 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nrlkk" podUID="e3790394-92d9-4667-9b1a-e3aa70c6a9f2" containerName="registry-server" containerID="cri-o://43252b8d086bfe623f4a91a17b16429c7ac8bd1c8c4411305128b2dd4288edea" gracePeriod=2 Oct 03 08:44:01 crc 
kubenswrapper[4664]: I1003 08:44:01.919250 4664 generic.go:334] "Generic (PLEG): container finished" podID="e3790394-92d9-4667-9b1a-e3aa70c6a9f2" containerID="43252b8d086bfe623f4a91a17b16429c7ac8bd1c8c4411305128b2dd4288edea" exitCode=0 Oct 03 08:44:01 crc kubenswrapper[4664]: I1003 08:44:01.919328 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrlkk" event={"ID":"e3790394-92d9-4667-9b1a-e3aa70c6a9f2","Type":"ContainerDied","Data":"43252b8d086bfe623f4a91a17b16429c7ac8bd1c8c4411305128b2dd4288edea"} Oct 03 08:44:02 crc kubenswrapper[4664]: I1003 08:44:02.020724 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:44:02 crc kubenswrapper[4664]: I1003 08:44:02.168677 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvnr8\" (UniqueName: \"kubernetes.io/projected/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-kube-api-access-dvnr8\") pod \"e3790394-92d9-4667-9b1a-e3aa70c6a9f2\" (UID: \"e3790394-92d9-4667-9b1a-e3aa70c6a9f2\") " Oct 03 08:44:02 crc kubenswrapper[4664]: I1003 08:44:02.169013 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-utilities\") pod \"e3790394-92d9-4667-9b1a-e3aa70c6a9f2\" (UID: \"e3790394-92d9-4667-9b1a-e3aa70c6a9f2\") " Oct 03 08:44:02 crc kubenswrapper[4664]: I1003 08:44:02.169070 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-catalog-content\") pod \"e3790394-92d9-4667-9b1a-e3aa70c6a9f2\" (UID: \"e3790394-92d9-4667-9b1a-e3aa70c6a9f2\") " Oct 03 08:44:02 crc kubenswrapper[4664]: I1003 08:44:02.171301 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-utilities" (OuterVolumeSpecName: "utilities") pod "e3790394-92d9-4667-9b1a-e3aa70c6a9f2" (UID: "e3790394-92d9-4667-9b1a-e3aa70c6a9f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:44:02 crc kubenswrapper[4664]: I1003 08:44:02.181250 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-kube-api-access-dvnr8" (OuterVolumeSpecName: "kube-api-access-dvnr8") pod "e3790394-92d9-4667-9b1a-e3aa70c6a9f2" (UID: "e3790394-92d9-4667-9b1a-e3aa70c6a9f2"). InnerVolumeSpecName "kube-api-access-dvnr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:44:02 crc kubenswrapper[4664]: I1003 08:44:02.272960 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvnr8\" (UniqueName: \"kubernetes.io/projected/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-kube-api-access-dvnr8\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:02 crc kubenswrapper[4664]: I1003 08:44:02.273004 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:02 crc kubenswrapper[4664]: I1003 08:44:02.305223 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3790394-92d9-4667-9b1a-e3aa70c6a9f2" (UID: "e3790394-92d9-4667-9b1a-e3aa70c6a9f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:44:02 crc kubenswrapper[4664]: I1003 08:44:02.374221 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3790394-92d9-4667-9b1a-e3aa70c6a9f2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:02 crc kubenswrapper[4664]: I1003 08:44:02.934753 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrlkk" event={"ID":"e3790394-92d9-4667-9b1a-e3aa70c6a9f2","Type":"ContainerDied","Data":"17a395175061abb23067107b01bec9434d45628f1311e08fd598e6fcce588128"} Oct 03 08:44:02 crc kubenswrapper[4664]: I1003 08:44:02.934840 4664 scope.go:117] "RemoveContainer" containerID="43252b8d086bfe623f4a91a17b16429c7ac8bd1c8c4411305128b2dd4288edea" Oct 03 08:44:02 crc kubenswrapper[4664]: I1003 08:44:02.934887 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nrlkk" Oct 03 08:44:02 crc kubenswrapper[4664]: I1003 08:44:02.974084 4664 scope.go:117] "RemoveContainer" containerID="820c7c7d24c5e86f2188b4d275cd194912c2883fda15890eb78ad26201bdfe7a" Oct 03 08:44:02 crc kubenswrapper[4664]: I1003 08:44:02.981409 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nrlkk"] Oct 03 08:44:03 crc kubenswrapper[4664]: I1003 08:44:03.025697 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nrlkk"] Oct 03 08:44:03 crc kubenswrapper[4664]: I1003 08:44:03.028658 4664 scope.go:117] "RemoveContainer" containerID="03f2ca66c9da56cc31e80257907edcf0de193833d6f6bfbae222f120974f01c9" Oct 03 08:44:03 crc kubenswrapper[4664]: I1003 08:44:03.876981 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:44:03 crc kubenswrapper[4664]: E1003 08:44:03.877592 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:44:03 crc kubenswrapper[4664]: I1003 08:44:03.888897 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3790394-92d9-4667-9b1a-e3aa70c6a9f2" path="/var/lib/kubelet/pods/e3790394-92d9-4667-9b1a-e3aa70c6a9f2/volumes" Oct 03 08:44:15 crc kubenswrapper[4664]: I1003 08:44:15.877307 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:44:15 crc kubenswrapper[4664]: E1003 08:44:15.878384 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:44:30 crc kubenswrapper[4664]: I1003 08:44:30.877264 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:44:30 crc kubenswrapper[4664]: E1003 08:44:30.878974 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:44:42 crc kubenswrapper[4664]: I1003 08:44:42.876599 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:44:42 crc kubenswrapper[4664]: E1003 08:44:42.877783 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:44:55 crc kubenswrapper[4664]: I1003 08:44:55.876284 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:44:55 crc kubenswrapper[4664]: E1003 08:44:55.877197 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.173353 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6"] Oct 03 08:45:00 crc kubenswrapper[4664]: E1003 08:45:00.175175 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3790394-92d9-4667-9b1a-e3aa70c6a9f2" containerName="extract-utilities" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.175203 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3790394-92d9-4667-9b1a-e3aa70c6a9f2" containerName="extract-utilities" Oct 03 08:45:00 crc kubenswrapper[4664]: E1003 08:45:00.175226 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3790394-92d9-4667-9b1a-e3aa70c6a9f2" containerName="extract-content" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.175234 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3790394-92d9-4667-9b1a-e3aa70c6a9f2" containerName="extract-content" Oct 03 08:45:00 crc kubenswrapper[4664]: E1003 08:45:00.175256 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6459c504-5d74-4b7d-b62a-25c9a95754f8" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.175265 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6459c504-5d74-4b7d-b62a-25c9a95754f8" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4664]: E1003 08:45:00.175280 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6459c504-5d74-4b7d-b62a-25c9a95754f8" containerName="extract-utilities" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.175289 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6459c504-5d74-4b7d-b62a-25c9a95754f8" containerName="extract-utilities" Oct 03 08:45:00 crc kubenswrapper[4664]: E1003 08:45:00.175301 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81bc144e-0356-46fa-ab7e-5aeaa662e875" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.175308 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="81bc144e-0356-46fa-ab7e-5aeaa662e875" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4664]: E1003 08:45:00.175325 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81bc144e-0356-46fa-ab7e-5aeaa662e875" containerName="extract-utilities" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.175333 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="81bc144e-0356-46fa-ab7e-5aeaa662e875" containerName="extract-utilities" Oct 03 08:45:00 crc kubenswrapper[4664]: E1003 08:45:00.175347 4664 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6459c504-5d74-4b7d-b62a-25c9a95754f8" containerName="extract-content" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.175354 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6459c504-5d74-4b7d-b62a-25c9a95754f8" containerName="extract-content" Oct 03 08:45:00 crc kubenswrapper[4664]: E1003 08:45:00.175385 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81bc144e-0356-46fa-ab7e-5aeaa662e875" containerName="extract-content" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.175392 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="81bc144e-0356-46fa-ab7e-5aeaa662e875" containerName="extract-content" Oct 03 08:45:00 crc kubenswrapper[4664]: E1003 08:45:00.175399 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3790394-92d9-4667-9b1a-e3aa70c6a9f2" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.175406 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3790394-92d9-4667-9b1a-e3aa70c6a9f2" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.175689 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="6459c504-5d74-4b7d-b62a-25c9a95754f8" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.175711 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3790394-92d9-4667-9b1a-e3aa70c6a9f2" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.175724 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="81bc144e-0356-46fa-ab7e-5aeaa662e875" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.176814 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.181389 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.181918 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.190202 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6"] Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.235938 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8587107-e045-4f3a-a20b-bfcaa25df7e4-secret-volume\") pod \"collect-profiles-29324685-gcmb6\" (UID: \"b8587107-e045-4f3a-a20b-bfcaa25df7e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.236021 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc8x7\" (UniqueName: \"kubernetes.io/projected/b8587107-e045-4f3a-a20b-bfcaa25df7e4-kube-api-access-vc8x7\") pod \"collect-profiles-29324685-gcmb6\" (UID: \"b8587107-e045-4f3a-a20b-bfcaa25df7e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.236077 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8587107-e045-4f3a-a20b-bfcaa25df7e4-config-volume\") pod \"collect-profiles-29324685-gcmb6\" (UID: \"b8587107-e045-4f3a-a20b-bfcaa25df7e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.338460 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8587107-e045-4f3a-a20b-bfcaa25df7e4-secret-volume\") pod \"collect-profiles-29324685-gcmb6\" (UID: \"b8587107-e045-4f3a-a20b-bfcaa25df7e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.338560 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc8x7\" (UniqueName: \"kubernetes.io/projected/b8587107-e045-4f3a-a20b-bfcaa25df7e4-kube-api-access-vc8x7\") pod \"collect-profiles-29324685-gcmb6\" (UID: \"b8587107-e045-4f3a-a20b-bfcaa25df7e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.338640 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8587107-e045-4f3a-a20b-bfcaa25df7e4-config-volume\") pod \"collect-profiles-29324685-gcmb6\" (UID: \"b8587107-e045-4f3a-a20b-bfcaa25df7e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.340077 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8587107-e045-4f3a-a20b-bfcaa25df7e4-config-volume\") pod 
\"collect-profiles-29324685-gcmb6\" (UID: \"b8587107-e045-4f3a-a20b-bfcaa25df7e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.347297 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8587107-e045-4f3a-a20b-bfcaa25df7e4-secret-volume\") pod \"collect-profiles-29324685-gcmb6\" (UID: \"b8587107-e045-4f3a-a20b-bfcaa25df7e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.364018 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc8x7\" (UniqueName: \"kubernetes.io/projected/b8587107-e045-4f3a-a20b-bfcaa25df7e4-kube-api-access-vc8x7\") pod \"collect-profiles-29324685-gcmb6\" (UID: \"b8587107-e045-4f3a-a20b-bfcaa25df7e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" Oct 03 08:45:00 crc kubenswrapper[4664]: I1003 08:45:00.509683 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" Oct 03 08:45:01 crc kubenswrapper[4664]: I1003 08:45:01.027253 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6"] Oct 03 08:45:01 crc kubenswrapper[4664]: I1003 08:45:01.526092 4664 generic.go:334] "Generic (PLEG): container finished" podID="b8587107-e045-4f3a-a20b-bfcaa25df7e4" containerID="b2cb1c7c824ec3b5aa6f029230aa19bb480ae9bcf48ea9493810b83cf227079e" exitCode=0 Oct 03 08:45:01 crc kubenswrapper[4664]: I1003 08:45:01.526205 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" event={"ID":"b8587107-e045-4f3a-a20b-bfcaa25df7e4","Type":"ContainerDied","Data":"b2cb1c7c824ec3b5aa6f029230aa19bb480ae9bcf48ea9493810b83cf227079e"} Oct 03 08:45:01 crc kubenswrapper[4664]: I1003 08:45:01.526561 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" event={"ID":"b8587107-e045-4f3a-a20b-bfcaa25df7e4","Type":"ContainerStarted","Data":"11c562bac84f084884160c31430fcb746d078203e72e3ae03b3d835622fc471b"} Oct 03 08:45:02 crc kubenswrapper[4664]: I1003 08:45:02.876102 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" Oct 03 08:45:03 crc kubenswrapper[4664]: I1003 08:45:03.001491 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8587107-e045-4f3a-a20b-bfcaa25df7e4-config-volume\") pod \"b8587107-e045-4f3a-a20b-bfcaa25df7e4\" (UID: \"b8587107-e045-4f3a-a20b-bfcaa25df7e4\") " Oct 03 08:45:03 crc kubenswrapper[4664]: I1003 08:45:03.001664 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc8x7\" (UniqueName: \"kubernetes.io/projected/b8587107-e045-4f3a-a20b-bfcaa25df7e4-kube-api-access-vc8x7\") pod \"b8587107-e045-4f3a-a20b-bfcaa25df7e4\" (UID: \"b8587107-e045-4f3a-a20b-bfcaa25df7e4\") " Oct 03 08:45:03 crc kubenswrapper[4664]: I1003 08:45:03.001694 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8587107-e045-4f3a-a20b-bfcaa25df7e4-secret-volume\") pod \"b8587107-e045-4f3a-a20b-bfcaa25df7e4\" (UID: \"b8587107-e045-4f3a-a20b-bfcaa25df7e4\") " Oct 03 08:45:03 crc kubenswrapper[4664]: I1003 08:45:03.002999 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8587107-e045-4f3a-a20b-bfcaa25df7e4-config-volume" (OuterVolumeSpecName: "config-volume") pod "b8587107-e045-4f3a-a20b-bfcaa25df7e4" (UID: "b8587107-e045-4f3a-a20b-bfcaa25df7e4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:45:03 crc kubenswrapper[4664]: I1003 08:45:03.010099 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8587107-e045-4f3a-a20b-bfcaa25df7e4-kube-api-access-vc8x7" (OuterVolumeSpecName: "kube-api-access-vc8x7") pod "b8587107-e045-4f3a-a20b-bfcaa25df7e4" (UID: "b8587107-e045-4f3a-a20b-bfcaa25df7e4"). InnerVolumeSpecName "kube-api-access-vc8x7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:45:03 crc kubenswrapper[4664]: I1003 08:45:03.010840 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8587107-e045-4f3a-a20b-bfcaa25df7e4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b8587107-e045-4f3a-a20b-bfcaa25df7e4" (UID: "b8587107-e045-4f3a-a20b-bfcaa25df7e4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:45:03 crc kubenswrapper[4664]: I1003 08:45:03.104346 4664 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8587107-e045-4f3a-a20b-bfcaa25df7e4-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:03 crc kubenswrapper[4664]: I1003 08:45:03.104384 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc8x7\" (UniqueName: \"kubernetes.io/projected/b8587107-e045-4f3a-a20b-bfcaa25df7e4-kube-api-access-vc8x7\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:03 crc kubenswrapper[4664]: I1003 08:45:03.104399 4664 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8587107-e045-4f3a-a20b-bfcaa25df7e4-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:03 crc kubenswrapper[4664]: I1003 08:45:03.549764 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" event={"ID":"b8587107-e045-4f3a-a20b-bfcaa25df7e4","Type":"ContainerDied","Data":"11c562bac84f084884160c31430fcb746d078203e72e3ae03b3d835622fc471b"} Oct 03 08:45:03 crc kubenswrapper[4664]: I1003 08:45:03.550271 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11c562bac84f084884160c31430fcb746d078203e72e3ae03b3d835622fc471b" Oct 03 08:45:03 crc kubenswrapper[4664]: I1003 08:45:03.549818 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-gcmb6" Oct 03 08:45:03 crc kubenswrapper[4664]: I1003 08:45:03.968098 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc"] Oct 03 08:45:03 crc kubenswrapper[4664]: I1003 08:45:03.977339 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324640-tg8nc"] Oct 03 08:45:05 crc kubenswrapper[4664]: I1003 08:45:05.895690 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dc17bd7-8f56-4801-8595-5c4578747397" path="/var/lib/kubelet/pods/5dc17bd7-8f56-4801-8595-5c4578747397/volumes" Oct 03 08:45:08 crc kubenswrapper[4664]: I1003 08:45:08.877440 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:45:08 crc kubenswrapper[4664]: E1003 08:45:08.878065 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:45:20 crc kubenswrapper[4664]: I1003 08:45:20.877804 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:45:20 crc kubenswrapper[4664]: E1003 08:45:20.878977 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:45:26 crc kubenswrapper[4664]: I1003 08:45:26.842373 4664 scope.go:117] "RemoveContainer" containerID="8363663347e92baf645e5ae9dff3c8ecc7c6bacd929ea27644bf6242b0dfe4e1" Oct 03 08:45:32 crc kubenswrapper[4664]: I1003 08:45:32.877015 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:45:32 crc kubenswrapper[4664]: E1003 08:45:32.879075 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:45:45 crc kubenswrapper[4664]: I1003 08:45:45.877763 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:45:47 crc kubenswrapper[4664]: I1003 08:45:47.023079 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"980273c98def58db30cd8640f9d8d3fecda4ca7c5d82aaa4081af96069aec987"} Oct 03 08:48:11 crc kubenswrapper[4664]: I1003 08:48:11.986959 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:48:11 crc kubenswrapper[4664]: I1003 08:48:11.987576 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:48:41 crc kubenswrapper[4664]: I1003 08:48:41.987696 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:48:41 crc kubenswrapper[4664]: I1003 08:48:41.988230 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:49:11 crc kubenswrapper[4664]: I1003 08:49:11.987672 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:49:11 crc kubenswrapper[4664]: I1003 08:49:11.988932 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" 
podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:49:11 crc kubenswrapper[4664]: I1003 08:49:11.989013 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 08:49:11 crc kubenswrapper[4664]: I1003 08:49:11.990210 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"980273c98def58db30cd8640f9d8d3fecda4ca7c5d82aaa4081af96069aec987"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:49:11 crc kubenswrapper[4664]: I1003 08:49:11.990280 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://980273c98def58db30cd8640f9d8d3fecda4ca7c5d82aaa4081af96069aec987" gracePeriod=600 Oct 03 08:49:13 crc kubenswrapper[4664]: I1003 08:49:13.120456 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="980273c98def58db30cd8640f9d8d3fecda4ca7c5d82aaa4081af96069aec987" exitCode=0 Oct 03 08:49:13 crc kubenswrapper[4664]: I1003 08:49:13.120579 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"980273c98def58db30cd8640f9d8d3fecda4ca7c5d82aaa4081af96069aec987"} Oct 03 08:49:13 crc kubenswrapper[4664]: I1003 08:49:13.121107 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8"} Oct 03 08:49:13 crc kubenswrapper[4664]: I1003 08:49:13.121138 4664 scope.go:117] "RemoveContainer" containerID="db2833a516ac8601848d83e3ded1226e2dd9d4f9bf2dd94522498c7951f646a6" Oct 03 08:51:41 crc kubenswrapper[4664]: I1003 08:51:41.987524 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:51:41 crc kubenswrapper[4664]: I1003 08:51:41.988171 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:52:11 crc kubenswrapper[4664]: I1003 08:52:11.986651 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:52:11 crc kubenswrapper[4664]: I1003 08:52:11.987691 4664 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:52:41 crc kubenswrapper[4664]: I1003 08:52:41.986890 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:52:41 crc kubenswrapper[4664]: I1003 08:52:41.988039 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:52:41 crc kubenswrapper[4664]: I1003 08:52:41.988115 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 08:52:41 crc kubenswrapper[4664]: I1003 08:52:41.989372 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:52:41 crc kubenswrapper[4664]: I1003 08:52:41.989439 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" gracePeriod=600 Oct 03 08:52:42 crc kubenswrapper[4664]: E1003 08:52:42.129817 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:52:42 crc kubenswrapper[4664]: I1003 08:52:42.255298 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" exitCode=0 Oct 03 08:52:42 crc kubenswrapper[4664]: I1003 08:52:42.255383 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8"} Oct 03 08:52:42 crc kubenswrapper[4664]: I1003 08:52:42.255666 4664 scope.go:117] "RemoveContainer" containerID="980273c98def58db30cd8640f9d8d3fecda4ca7c5d82aaa4081af96069aec987" Oct 03 08:52:42 crc kubenswrapper[4664]: I1003 08:52:42.256421 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:52:42 crc 
kubenswrapper[4664]: E1003 08:52:42.256767 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:52:56 crc kubenswrapper[4664]: I1003 08:52:56.876526 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:52:56 crc kubenswrapper[4664]: E1003 08:52:56.877459 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:53:09 crc kubenswrapper[4664]: I1003 08:53:09.895909 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:53:09 crc kubenswrapper[4664]: E1003 08:53:09.897538 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:53:21 crc kubenswrapper[4664]: I1003 08:53:21.877032 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:53:21 crc kubenswrapper[4664]: E1003 08:53:21.878291 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:53:32 crc kubenswrapper[4664]: I1003 08:53:32.876695 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:53:32 crc kubenswrapper[4664]: E1003 08:53:32.879521 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:53:33 crc kubenswrapper[4664]: I1003 08:53:33.672082 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f5pqt"] Oct 03 08:53:33 crc kubenswrapper[4664]: E1003 08:53:33.672556 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8587107-e045-4f3a-a20b-bfcaa25df7e4" containerName="collect-profiles" Oct 03 08:53:33 
crc kubenswrapper[4664]: I1003 08:53:33.672575 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8587107-e045-4f3a-a20b-bfcaa25df7e4" containerName="collect-profiles" Oct 03 08:53:33 crc kubenswrapper[4664]: I1003 08:53:33.672844 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8587107-e045-4f3a-a20b-bfcaa25df7e4" containerName="collect-profiles" Oct 03 08:53:33 crc kubenswrapper[4664]: I1003 08:53:33.678251 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:33 crc kubenswrapper[4664]: I1003 08:53:33.684791 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f5pqt"] Oct 03 08:53:33 crc kubenswrapper[4664]: I1003 08:53:33.743242 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c772da-a250-4c08-a5dc-882055cb19d8-catalog-content\") pod \"certified-operators-f5pqt\" (UID: \"b2c772da-a250-4c08-a5dc-882055cb19d8\") " pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:33 crc kubenswrapper[4664]: I1003 08:53:33.743857 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-986hv\" (UniqueName: \"kubernetes.io/projected/b2c772da-a250-4c08-a5dc-882055cb19d8-kube-api-access-986hv\") pod \"certified-operators-f5pqt\" (UID: \"b2c772da-a250-4c08-a5dc-882055cb19d8\") " pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:33 crc kubenswrapper[4664]: I1003 08:53:33.744293 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c772da-a250-4c08-a5dc-882055cb19d8-utilities\") pod \"certified-operators-f5pqt\" (UID: \"b2c772da-a250-4c08-a5dc-882055cb19d8\") " pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:33 crc kubenswrapper[4664]: I1003 08:53:33.846276 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c772da-a250-4c08-a5dc-882055cb19d8-utilities\") pod \"certified-operators-f5pqt\" (UID: \"b2c772da-a250-4c08-a5dc-882055cb19d8\") " pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:33 crc kubenswrapper[4664]: I1003 08:53:33.846410 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c772da-a250-4c08-a5dc-882055cb19d8-catalog-content\") pod \"certified-operators-f5pqt\" (UID: \"b2c772da-a250-4c08-a5dc-882055cb19d8\") " pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:33 crc kubenswrapper[4664]: I1003 08:53:33.846514 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-986hv\" (UniqueName: \"kubernetes.io/projected/b2c772da-a250-4c08-a5dc-882055cb19d8-kube-api-access-986hv\") pod \"certified-operators-f5pqt\" (UID: \"b2c772da-a250-4c08-a5dc-882055cb19d8\") " pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:33 crc kubenswrapper[4664]: I1003 08:53:33.846923 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c772da-a250-4c08-a5dc-882055cb19d8-utilities\") pod \"certified-operators-f5pqt\" (UID: \"b2c772da-a250-4c08-a5dc-882055cb19d8\") " 
pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:33 crc kubenswrapper[4664]: I1003 08:53:33.847150 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c772da-a250-4c08-a5dc-882055cb19d8-catalog-content\") pod \"certified-operators-f5pqt\" (UID: \"b2c772da-a250-4c08-a5dc-882055cb19d8\") " pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:33 crc kubenswrapper[4664]: I1003 08:53:33.878567 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-986hv\" (UniqueName: \"kubernetes.io/projected/b2c772da-a250-4c08-a5dc-882055cb19d8-kube-api-access-986hv\") pod \"certified-operators-f5pqt\" (UID: \"b2c772da-a250-4c08-a5dc-882055cb19d8\") " pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:34 crc kubenswrapper[4664]: I1003 08:53:34.011320 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:34 crc kubenswrapper[4664]: I1003 08:53:34.562475 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f5pqt"] Oct 03 08:53:34 crc kubenswrapper[4664]: I1003 08:53:34.776177 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f5pqt" event={"ID":"b2c772da-a250-4c08-a5dc-882055cb19d8","Type":"ContainerStarted","Data":"61e1e21c8cb767dedf1548fe78744d2370453971f3a053d439ee2aefebe7dd63"} Oct 03 08:53:35 crc kubenswrapper[4664]: I1003 08:53:35.788798 4664 generic.go:334] "Generic (PLEG): container finished" podID="b2c772da-a250-4c08-a5dc-882055cb19d8" containerID="72a429e7dd0734aca540cc01b5f1f1cd9532809a53b3ddaf710fcf9cde7216cb" exitCode=0 Oct 03 08:53:35 crc kubenswrapper[4664]: I1003 08:53:35.788869 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f5pqt" event={"ID":"b2c772da-a250-4c08-a5dc-882055cb19d8","Type":"ContainerDied","Data":"72a429e7dd0734aca540cc01b5f1f1cd9532809a53b3ddaf710fcf9cde7216cb"} Oct 03 08:53:35 crc kubenswrapper[4664]: I1003 08:53:35.791839 4664 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:53:36 crc kubenswrapper[4664]: I1003 08:53:36.800701 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f5pqt" event={"ID":"b2c772da-a250-4c08-a5dc-882055cb19d8","Type":"ContainerStarted","Data":"afbd14ec48d013d4f2d28bc7d4d5ba27e234221ceb1c40a70a61c2f74230e0d1"} Oct 03 08:53:37 crc kubenswrapper[4664]: I1003 08:53:37.811498 4664 generic.go:334] "Generic (PLEG): container finished" podID="b2c772da-a250-4c08-a5dc-882055cb19d8" containerID="afbd14ec48d013d4f2d28bc7d4d5ba27e234221ceb1c40a70a61c2f74230e0d1" exitCode=0 Oct 03 08:53:37 crc kubenswrapper[4664]: I1003 08:53:37.811574 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f5pqt" event={"ID":"b2c772da-a250-4c08-a5dc-882055cb19d8","Type":"ContainerDied","Data":"afbd14ec48d013d4f2d28bc7d4d5ba27e234221ceb1c40a70a61c2f74230e0d1"} Oct 03 08:53:38 crc kubenswrapper[4664]: I1003 08:53:38.827118 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f5pqt" event={"ID":"b2c772da-a250-4c08-a5dc-882055cb19d8","Type":"ContainerStarted","Data":"bff7dc7581dab84a8cc2c6922594f779796e97af7170ef3c2d311c18dcfc02a9"} Oct 03 08:53:38 crc kubenswrapper[4664]: I1003 
08:53:38.858859 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f5pqt" podStartSLOduration=3.447737467 podStartE2EDuration="5.858831787s" podCreationTimestamp="2025-10-03 08:53:33 +0000 UTC" firstStartedPulling="2025-10-03 08:53:35.791485573 +0000 UTC m=+3916.612676063" lastFinishedPulling="2025-10-03 08:53:38.202579893 +0000 UTC m=+3919.023770383" observedRunningTime="2025-10-03 08:53:38.847092691 +0000 UTC m=+3919.668283201" watchObservedRunningTime="2025-10-03 08:53:38.858831787 +0000 UTC m=+3919.680022277" Oct 03 08:53:44 crc kubenswrapper[4664]: I1003 08:53:44.011955 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:44 crc kubenswrapper[4664]: I1003 08:53:44.013177 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:44 crc kubenswrapper[4664]: I1003 08:53:44.066910 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:44 crc kubenswrapper[4664]: I1003 08:53:44.951328 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:45 crc kubenswrapper[4664]: I1003 08:53:45.008429 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f5pqt"] Oct 03 08:53:46 crc kubenswrapper[4664]: I1003 08:53:46.877168 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:53:46 crc kubenswrapper[4664]: E1003 08:53:46.878477 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:53:46 crc kubenswrapper[4664]: I1003 08:53:46.913741 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f5pqt" podUID="b2c772da-a250-4c08-a5dc-882055cb19d8" containerName="registry-server" containerID="cri-o://bff7dc7581dab84a8cc2c6922594f779796e97af7170ef3c2d311c18dcfc02a9" gracePeriod=2 Oct 03 08:53:47 crc kubenswrapper[4664]: E1003 08:53:47.202048 4664 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2c772da_a250_4c08_a5dc_882055cb19d8.slice/crio-bff7dc7581dab84a8cc2c6922594f779796e97af7170ef3c2d311c18dcfc02a9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2c772da_a250_4c08_a5dc_882055cb19d8.slice/crio-conmon-bff7dc7581dab84a8cc2c6922594f779796e97af7170ef3c2d311c18dcfc02a9.scope\": RecentStats: unable to find data in memory cache]" Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.401974 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.474347 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c772da-a250-4c08-a5dc-882055cb19d8-utilities\") pod \"b2c772da-a250-4c08-a5dc-882055cb19d8\" (UID: \"b2c772da-a250-4c08-a5dc-882055cb19d8\") " Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.474455 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c772da-a250-4c08-a5dc-882055cb19d8-catalog-content\") pod \"b2c772da-a250-4c08-a5dc-882055cb19d8\" (UID: \"b2c772da-a250-4c08-a5dc-882055cb19d8\") " Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.474511 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-986hv\" (UniqueName: \"kubernetes.io/projected/b2c772da-a250-4c08-a5dc-882055cb19d8-kube-api-access-986hv\") pod \"b2c772da-a250-4c08-a5dc-882055cb19d8\" (UID: \"b2c772da-a250-4c08-a5dc-882055cb19d8\") " Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.476274 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2c772da-a250-4c08-a5dc-882055cb19d8-utilities" (OuterVolumeSpecName: "utilities") pod "b2c772da-a250-4c08-a5dc-882055cb19d8" (UID: "b2c772da-a250-4c08-a5dc-882055cb19d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.490198 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2c772da-a250-4c08-a5dc-882055cb19d8-kube-api-access-986hv" (OuterVolumeSpecName: "kube-api-access-986hv") pod "b2c772da-a250-4c08-a5dc-882055cb19d8" (UID: "b2c772da-a250-4c08-a5dc-882055cb19d8"). InnerVolumeSpecName "kube-api-access-986hv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.577189 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c772da-a250-4c08-a5dc-882055cb19d8-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.577231 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-986hv\" (UniqueName: \"kubernetes.io/projected/b2c772da-a250-4c08-a5dc-882055cb19d8-kube-api-access-986hv\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.610979 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2c772da-a250-4c08-a5dc-882055cb19d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2c772da-a250-4c08-a5dc-882055cb19d8" (UID: "b2c772da-a250-4c08-a5dc-882055cb19d8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.679211 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c772da-a250-4c08-a5dc-882055cb19d8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.923687 4664 generic.go:334] "Generic (PLEG): container finished" podID="b2c772da-a250-4c08-a5dc-882055cb19d8" containerID="bff7dc7581dab84a8cc2c6922594f779796e97af7170ef3c2d311c18dcfc02a9" exitCode=0 Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.923748 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f5pqt" event={"ID":"b2c772da-a250-4c08-a5dc-882055cb19d8","Type":"ContainerDied","Data":"bff7dc7581dab84a8cc2c6922594f779796e97af7170ef3c2d311c18dcfc02a9"} Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.923859 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f5pqt" event={"ID":"b2c772da-a250-4c08-a5dc-882055cb19d8","Type":"ContainerDied","Data":"61e1e21c8cb767dedf1548fe78744d2370453971f3a053d439ee2aefebe7dd63"} Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.923884 4664 scope.go:117] "RemoveContainer" containerID="bff7dc7581dab84a8cc2c6922594f779796e97af7170ef3c2d311c18dcfc02a9" Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.923914 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f5pqt" Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.949861 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f5pqt"] Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.956720 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f5pqt"] Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.957567 4664 scope.go:117] "RemoveContainer" containerID="afbd14ec48d013d4f2d28bc7d4d5ba27e234221ceb1c40a70a61c2f74230e0d1" Oct 03 08:53:47 crc kubenswrapper[4664]: I1003 08:53:47.978831 4664 scope.go:117] "RemoveContainer" containerID="72a429e7dd0734aca540cc01b5f1f1cd9532809a53b3ddaf710fcf9cde7216cb" Oct 03 08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.017737 4664 scope.go:117] "RemoveContainer" containerID="bff7dc7581dab84a8cc2c6922594f779796e97af7170ef3c2d311c18dcfc02a9" Oct 03 08:53:48 crc kubenswrapper[4664]: E1003 08:53:48.018889 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bff7dc7581dab84a8cc2c6922594f779796e97af7170ef3c2d311c18dcfc02a9\": container with ID starting with bff7dc7581dab84a8cc2c6922594f779796e97af7170ef3c2d311c18dcfc02a9 not found: ID does not exist" containerID="bff7dc7581dab84a8cc2c6922594f779796e97af7170ef3c2d311c18dcfc02a9" Oct 03 08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.018947 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bff7dc7581dab84a8cc2c6922594f779796e97af7170ef3c2d311c18dcfc02a9"} err="failed to get container status \"bff7dc7581dab84a8cc2c6922594f779796e97af7170ef3c2d311c18dcfc02a9\": rpc error: code = NotFound desc = could not find container \"bff7dc7581dab84a8cc2c6922594f779796e97af7170ef3c2d311c18dcfc02a9\": container with ID starting with bff7dc7581dab84a8cc2c6922594f779796e97af7170ef3c2d311c18dcfc02a9 not found: ID does not exist" Oct 03 
08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.018984 4664 scope.go:117] "RemoveContainer" containerID="afbd14ec48d013d4f2d28bc7d4d5ba27e234221ceb1c40a70a61c2f74230e0d1" Oct 03 08:53:48 crc kubenswrapper[4664]: E1003 08:53:48.019549 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afbd14ec48d013d4f2d28bc7d4d5ba27e234221ceb1c40a70a61c2f74230e0d1\": container with ID starting with afbd14ec48d013d4f2d28bc7d4d5ba27e234221ceb1c40a70a61c2f74230e0d1 not found: ID does not exist" containerID="afbd14ec48d013d4f2d28bc7d4d5ba27e234221ceb1c40a70a61c2f74230e0d1" Oct 03 08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.019621 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afbd14ec48d013d4f2d28bc7d4d5ba27e234221ceb1c40a70a61c2f74230e0d1"} err="failed to get container status \"afbd14ec48d013d4f2d28bc7d4d5ba27e234221ceb1c40a70a61c2f74230e0d1\": rpc error: code = NotFound desc = could not find container \"afbd14ec48d013d4f2d28bc7d4d5ba27e234221ceb1c40a70a61c2f74230e0d1\": container with ID starting with afbd14ec48d013d4f2d28bc7d4d5ba27e234221ceb1c40a70a61c2f74230e0d1 not found: ID does not exist" Oct 03 08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.019650 4664 scope.go:117] "RemoveContainer" containerID="72a429e7dd0734aca540cc01b5f1f1cd9532809a53b3ddaf710fcf9cde7216cb" Oct 03 08:53:48 crc kubenswrapper[4664]: E1003 08:53:48.020335 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72a429e7dd0734aca540cc01b5f1f1cd9532809a53b3ddaf710fcf9cde7216cb\": container with ID starting with 72a429e7dd0734aca540cc01b5f1f1cd9532809a53b3ddaf710fcf9cde7216cb not found: ID does not exist" containerID="72a429e7dd0734aca540cc01b5f1f1cd9532809a53b3ddaf710fcf9cde7216cb" Oct 03 08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.020378 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72a429e7dd0734aca540cc01b5f1f1cd9532809a53b3ddaf710fcf9cde7216cb"} err="failed to get container status \"72a429e7dd0734aca540cc01b5f1f1cd9532809a53b3ddaf710fcf9cde7216cb\": rpc error: code = NotFound desc = could not find container \"72a429e7dd0734aca540cc01b5f1f1cd9532809a53b3ddaf710fcf9cde7216cb\": container with ID starting with 72a429e7dd0734aca540cc01b5f1f1cd9532809a53b3ddaf710fcf9cde7216cb not found: ID does not exist" Oct 03 08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.804564 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d87vb"] Oct 03 08:53:48 crc kubenswrapper[4664]: E1003 08:53:48.805466 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c772da-a250-4c08-a5dc-882055cb19d8" containerName="extract-utilities" Oct 03 08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.805592 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c772da-a250-4c08-a5dc-882055cb19d8" containerName="extract-utilities" Oct 03 08:53:48 crc kubenswrapper[4664]: E1003 08:53:48.805771 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c772da-a250-4c08-a5dc-882055cb19d8" containerName="registry-server" Oct 03 08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.805862 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c772da-a250-4c08-a5dc-882055cb19d8" containerName="registry-server" Oct 03 08:53:48 crc kubenswrapper[4664]: E1003 08:53:48.805973 4664 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b2c772da-a250-4c08-a5dc-882055cb19d8" containerName="extract-content" Oct 03 08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.806057 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c772da-a250-4c08-a5dc-882055cb19d8" containerName="extract-content" Oct 03 08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.806397 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2c772da-a250-4c08-a5dc-882055cb19d8" containerName="registry-server" Oct 03 08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.808108 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.819941 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d87vb"] Oct 03 08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.905715 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-utilities\") pod \"redhat-marketplace-d87vb\" (UID: \"ba4ebf5c-b152-4de7-88bc-4536f7bf3206\") " pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.905775 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-catalog-content\") pod \"redhat-marketplace-d87vb\" (UID: \"ba4ebf5c-b152-4de7-88bc-4536f7bf3206\") " pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:53:48 crc kubenswrapper[4664]: I1003 08:53:48.906463 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgt7t\" (UniqueName: \"kubernetes.io/projected/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-kube-api-access-qgt7t\") pod \"redhat-marketplace-d87vb\" (UID: \"ba4ebf5c-b152-4de7-88bc-4536f7bf3206\") " pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:53:49 crc kubenswrapper[4664]: I1003 08:53:49.010345 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgt7t\" (UniqueName: \"kubernetes.io/projected/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-kube-api-access-qgt7t\") pod \"redhat-marketplace-d87vb\" (UID: \"ba4ebf5c-b152-4de7-88bc-4536f7bf3206\") " pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:53:49 crc kubenswrapper[4664]: I1003 08:53:49.010806 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-utilities\") pod \"redhat-marketplace-d87vb\" (UID: \"ba4ebf5c-b152-4de7-88bc-4536f7bf3206\") " pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:53:49 crc kubenswrapper[4664]: I1003 08:53:49.010866 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-catalog-content\") pod \"redhat-marketplace-d87vb\" (UID: \"ba4ebf5c-b152-4de7-88bc-4536f7bf3206\") " pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:53:49 crc kubenswrapper[4664]: I1003 08:53:49.011553 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-utilities\") pod \"redhat-marketplace-d87vb\" (UID: 
\"ba4ebf5c-b152-4de7-88bc-4536f7bf3206\") " pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:53:49 crc kubenswrapper[4664]: I1003 08:53:49.011595 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-catalog-content\") pod \"redhat-marketplace-d87vb\" (UID: \"ba4ebf5c-b152-4de7-88bc-4536f7bf3206\") " pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:53:49 crc kubenswrapper[4664]: I1003 08:53:49.062127 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgt7t\" (UniqueName: \"kubernetes.io/projected/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-kube-api-access-qgt7t\") pod \"redhat-marketplace-d87vb\" (UID: \"ba4ebf5c-b152-4de7-88bc-4536f7bf3206\") " pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:53:49 crc kubenswrapper[4664]: I1003 08:53:49.139362 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:53:50 crc kubenswrapper[4664]: I1003 08:53:49.621031 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d87vb"] Oct 03 08:53:50 crc kubenswrapper[4664]: I1003 08:53:49.889489 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2c772da-a250-4c08-a5dc-882055cb19d8" path="/var/lib/kubelet/pods/b2c772da-a250-4c08-a5dc-882055cb19d8/volumes" Oct 03 08:53:50 crc kubenswrapper[4664]: I1003 08:53:49.943523 4664 generic.go:334] "Generic (PLEG): container finished" podID="ba4ebf5c-b152-4de7-88bc-4536f7bf3206" containerID="1088b08225167298a3e14613e36cb5438c87cbb5d0744a7fa2b1c46ce822f450" exitCode=0 Oct 03 08:53:50 crc kubenswrapper[4664]: I1003 08:53:49.943583 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d87vb" event={"ID":"ba4ebf5c-b152-4de7-88bc-4536f7bf3206","Type":"ContainerDied","Data":"1088b08225167298a3e14613e36cb5438c87cbb5d0744a7fa2b1c46ce822f450"} Oct 03 08:53:50 crc kubenswrapper[4664]: I1003 08:53:49.943658 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d87vb" event={"ID":"ba4ebf5c-b152-4de7-88bc-4536f7bf3206","Type":"ContainerStarted","Data":"73ee382a669a03ee0605d76f110c0e30a64e86c7a0b670505e2b80244b857aca"} Oct 03 08:53:50 crc kubenswrapper[4664]: I1003 08:53:50.955932 4664 generic.go:334] "Generic (PLEG): container finished" podID="ba4ebf5c-b152-4de7-88bc-4536f7bf3206" containerID="89edcb8385e321cb5a03a4472b6ee3a834bed02d660d8be89479ed57d6c7f9b3" exitCode=0 Oct 03 08:53:50 crc kubenswrapper[4664]: I1003 08:53:50.956158 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d87vb" event={"ID":"ba4ebf5c-b152-4de7-88bc-4536f7bf3206","Type":"ContainerDied","Data":"89edcb8385e321cb5a03a4472b6ee3a834bed02d660d8be89479ed57d6c7f9b3"} Oct 03 08:53:51 crc kubenswrapper[4664]: I1003 08:53:51.970317 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d87vb" event={"ID":"ba4ebf5c-b152-4de7-88bc-4536f7bf3206","Type":"ContainerStarted","Data":"8f11a96c67656936316c9d08846823a7511de4a3d3011d4b7aeb04ce054c4165"} Oct 03 08:53:52 crc kubenswrapper[4664]: I1003 08:53:52.000931 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d87vb" podStartSLOduration=2.301456101 podStartE2EDuration="4.00090375s" 
podCreationTimestamp="2025-10-03 08:53:48 +0000 UTC" firstStartedPulling="2025-10-03 08:53:49.945576019 +0000 UTC m=+3930.766766509" lastFinishedPulling="2025-10-03 08:53:51.645023668 +0000 UTC m=+3932.466214158" observedRunningTime="2025-10-03 08:53:51.992369476 +0000 UTC m=+3932.813559996" watchObservedRunningTime="2025-10-03 08:53:52.00090375 +0000 UTC m=+3932.822094230" Oct 03 08:53:58 crc kubenswrapper[4664]: I1003 08:53:58.876731 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:53:58 crc kubenswrapper[4664]: E1003 08:53:58.877284 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:53:59 crc kubenswrapper[4664]: I1003 08:53:59.139997 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:53:59 crc kubenswrapper[4664]: I1003 08:53:59.140933 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:53:59 crc kubenswrapper[4664]: I1003 08:53:59.193247 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:54:00 crc kubenswrapper[4664]: I1003 08:54:00.106846 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:54:00 crc kubenswrapper[4664]: I1003 08:54:00.162397 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d87vb"] Oct 03 08:54:02 crc kubenswrapper[4664]: I1003 08:54:02.063738 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d87vb" podUID="ba4ebf5c-b152-4de7-88bc-4536f7bf3206" containerName="registry-server" containerID="cri-o://8f11a96c67656936316c9d08846823a7511de4a3d3011d4b7aeb04ce054c4165" gracePeriod=2 Oct 03 08:54:02 crc kubenswrapper[4664]: I1003 08:54:02.484147 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:54:02 crc kubenswrapper[4664]: I1003 08:54:02.542238 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgt7t\" (UniqueName: \"kubernetes.io/projected/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-kube-api-access-qgt7t\") pod \"ba4ebf5c-b152-4de7-88bc-4536f7bf3206\" (UID: \"ba4ebf5c-b152-4de7-88bc-4536f7bf3206\") " Oct 03 08:54:02 crc kubenswrapper[4664]: I1003 08:54:02.542455 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-catalog-content\") pod \"ba4ebf5c-b152-4de7-88bc-4536f7bf3206\" (UID: \"ba4ebf5c-b152-4de7-88bc-4536f7bf3206\") " Oct 03 08:54:02 crc kubenswrapper[4664]: I1003 08:54:02.542487 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-utilities\") pod \"ba4ebf5c-b152-4de7-88bc-4536f7bf3206\" (UID: \"ba4ebf5c-b152-4de7-88bc-4536f7bf3206\") " Oct 03 08:54:02 crc kubenswrapper[4664]: I1003 08:54:02.543662 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-utilities" (OuterVolumeSpecName: "utilities") pod "ba4ebf5c-b152-4de7-88bc-4536f7bf3206" (UID: "ba4ebf5c-b152-4de7-88bc-4536f7bf3206"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:54:02 crc kubenswrapper[4664]: I1003 08:54:02.544255 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:02 crc kubenswrapper[4664]: I1003 08:54:02.550588 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-kube-api-access-qgt7t" (OuterVolumeSpecName: "kube-api-access-qgt7t") pod "ba4ebf5c-b152-4de7-88bc-4536f7bf3206" (UID: "ba4ebf5c-b152-4de7-88bc-4536f7bf3206"). InnerVolumeSpecName "kube-api-access-qgt7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:54:02 crc kubenswrapper[4664]: I1003 08:54:02.560472 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba4ebf5c-b152-4de7-88bc-4536f7bf3206" (UID: "ba4ebf5c-b152-4de7-88bc-4536f7bf3206"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:54:02 crc kubenswrapper[4664]: I1003 08:54:02.646072 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgt7t\" (UniqueName: \"kubernetes.io/projected/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-kube-api-access-qgt7t\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:02 crc kubenswrapper[4664]: I1003 08:54:02.646110 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba4ebf5c-b152-4de7-88bc-4536f7bf3206-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.074941 4664 generic.go:334] "Generic (PLEG): container finished" podID="ba4ebf5c-b152-4de7-88bc-4536f7bf3206" containerID="8f11a96c67656936316c9d08846823a7511de4a3d3011d4b7aeb04ce054c4165" exitCode=0 Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.074992 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d87vb" event={"ID":"ba4ebf5c-b152-4de7-88bc-4536f7bf3206","Type":"ContainerDied","Data":"8f11a96c67656936316c9d08846823a7511de4a3d3011d4b7aeb04ce054c4165"} Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.075949 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d87vb" event={"ID":"ba4ebf5c-b152-4de7-88bc-4536f7bf3206","Type":"ContainerDied","Data":"73ee382a669a03ee0605d76f110c0e30a64e86c7a0b670505e2b80244b857aca"} Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.075977 4664 scope.go:117] "RemoveContainer" containerID="8f11a96c67656936316c9d08846823a7511de4a3d3011d4b7aeb04ce054c4165" Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.075094 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d87vb" Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.131325 4664 scope.go:117] "RemoveContainer" containerID="89edcb8385e321cb5a03a4472b6ee3a834bed02d660d8be89479ed57d6c7f9b3" Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.136978 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d87vb"] Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.154001 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d87vb"] Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.167062 4664 scope.go:117] "RemoveContainer" containerID="1088b08225167298a3e14613e36cb5438c87cbb5d0744a7fa2b1c46ce822f450" Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.209654 4664 scope.go:117] "RemoveContainer" containerID="8f11a96c67656936316c9d08846823a7511de4a3d3011d4b7aeb04ce054c4165" Oct 03 08:54:03 crc kubenswrapper[4664]: E1003 08:54:03.210176 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f11a96c67656936316c9d08846823a7511de4a3d3011d4b7aeb04ce054c4165\": container with ID starting with 8f11a96c67656936316c9d08846823a7511de4a3d3011d4b7aeb04ce054c4165 not found: ID does not exist" containerID="8f11a96c67656936316c9d08846823a7511de4a3d3011d4b7aeb04ce054c4165" Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.210233 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f11a96c67656936316c9d08846823a7511de4a3d3011d4b7aeb04ce054c4165"} err="failed to get container status \"8f11a96c67656936316c9d08846823a7511de4a3d3011d4b7aeb04ce054c4165\": rpc error: code = NotFound desc = could not find container \"8f11a96c67656936316c9d08846823a7511de4a3d3011d4b7aeb04ce054c4165\": container with ID starting with 8f11a96c67656936316c9d08846823a7511de4a3d3011d4b7aeb04ce054c4165 not found: ID does not exist" Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.210268 4664 scope.go:117] "RemoveContainer" containerID="89edcb8385e321cb5a03a4472b6ee3a834bed02d660d8be89479ed57d6c7f9b3" Oct 03 08:54:03 crc kubenswrapper[4664]: E1003 08:54:03.210536 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89edcb8385e321cb5a03a4472b6ee3a834bed02d660d8be89479ed57d6c7f9b3\": container with ID starting with 89edcb8385e321cb5a03a4472b6ee3a834bed02d660d8be89479ed57d6c7f9b3 not found: ID does not exist" containerID="89edcb8385e321cb5a03a4472b6ee3a834bed02d660d8be89479ed57d6c7f9b3" Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.210560 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89edcb8385e321cb5a03a4472b6ee3a834bed02d660d8be89479ed57d6c7f9b3"} err="failed to get container status \"89edcb8385e321cb5a03a4472b6ee3a834bed02d660d8be89479ed57d6c7f9b3\": rpc error: code = NotFound desc = could not find container \"89edcb8385e321cb5a03a4472b6ee3a834bed02d660d8be89479ed57d6c7f9b3\": container with ID starting with 89edcb8385e321cb5a03a4472b6ee3a834bed02d660d8be89479ed57d6c7f9b3 not found: ID does not exist" Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.210573 4664 scope.go:117] "RemoveContainer" containerID="1088b08225167298a3e14613e36cb5438c87cbb5d0744a7fa2b1c46ce822f450" Oct 03 08:54:03 crc kubenswrapper[4664]: E1003 08:54:03.210806 4664 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1088b08225167298a3e14613e36cb5438c87cbb5d0744a7fa2b1c46ce822f450\": container with ID starting with 1088b08225167298a3e14613e36cb5438c87cbb5d0744a7fa2b1c46ce822f450 not found: ID does not exist" containerID="1088b08225167298a3e14613e36cb5438c87cbb5d0744a7fa2b1c46ce822f450" Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.210829 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1088b08225167298a3e14613e36cb5438c87cbb5d0744a7fa2b1c46ce822f450"} err="failed to get container status \"1088b08225167298a3e14613e36cb5438c87cbb5d0744a7fa2b1c46ce822f450\": rpc error: code = NotFound desc = could not find container \"1088b08225167298a3e14613e36cb5438c87cbb5d0744a7fa2b1c46ce822f450\": container with ID starting with 1088b08225167298a3e14613e36cb5438c87cbb5d0744a7fa2b1c46ce822f450 not found: ID does not exist" Oct 03 08:54:03 crc kubenswrapper[4664]: I1003 08:54:03.891042 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba4ebf5c-b152-4de7-88bc-4536f7bf3206" path="/var/lib/kubelet/pods/ba4ebf5c-b152-4de7-88bc-4536f7bf3206/volumes" Oct 03 08:54:10 crc kubenswrapper[4664]: I1003 08:54:10.876855 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:54:10 crc kubenswrapper[4664]: E1003 08:54:10.877601 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:54:21 crc kubenswrapper[4664]: I1003 08:54:21.876708 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:54:21 crc kubenswrapper[4664]: E1003 08:54:21.877759 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:54:35 crc kubenswrapper[4664]: I1003 08:54:35.877981 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:54:35 crc kubenswrapper[4664]: E1003 08:54:35.879005 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:54:48 crc kubenswrapper[4664]: I1003 08:54:48.875803 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:54:48 crc kubenswrapper[4664]: E1003 08:54:48.876471 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:55:00 crc kubenswrapper[4664]: I1003 08:55:00.877219 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:55:00 crc kubenswrapper[4664]: E1003 08:55:00.878034 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:55:12 crc kubenswrapper[4664]: I1003 08:55:12.877204 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:55:12 crc kubenswrapper[4664]: E1003 08:55:12.878128 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:55:25 crc kubenswrapper[4664]: I1003 08:55:25.876919 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:55:25 crc kubenswrapper[4664]: E1003 08:55:25.877709 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:55:39 crc kubenswrapper[4664]: I1003 08:55:39.876021 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:55:39 crc kubenswrapper[4664]: E1003 08:55:39.876842 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:55:51 crc kubenswrapper[4664]: I1003 08:55:51.876683 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:55:51 crc kubenswrapper[4664]: E1003 08:55:51.877410 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:56:04 crc kubenswrapper[4664]: I1003 08:56:04.876948 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:56:04 crc kubenswrapper[4664]: E1003 08:56:04.877818 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:56:18 crc kubenswrapper[4664]: I1003 08:56:18.877001 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:56:18 crc kubenswrapper[4664]: E1003 08:56:18.877958 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:56:31 crc kubenswrapper[4664]: I1003 08:56:31.876791 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:56:31 crc kubenswrapper[4664]: E1003 08:56:31.877461 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:56:46 crc kubenswrapper[4664]: I1003 08:56:46.876631 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:56:46 crc kubenswrapper[4664]: E1003 08:56:46.877644 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:56:59 crc kubenswrapper[4664]: I1003 08:56:59.892784 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rfjf8"] Oct 03 08:56:59 crc kubenswrapper[4664]: E1003 08:56:59.893980 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4ebf5c-b152-4de7-88bc-4536f7bf3206" containerName="registry-server" Oct 03 08:56:59 crc kubenswrapper[4664]: I1003 08:56:59.893996 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4ebf5c-b152-4de7-88bc-4536f7bf3206" containerName="registry-server" Oct 03 08:56:59 crc kubenswrapper[4664]: E1003 08:56:59.894062 4664 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4ebf5c-b152-4de7-88bc-4536f7bf3206" containerName="extract-utilities" Oct 03 08:56:59 crc kubenswrapper[4664]: I1003 08:56:59.894069 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4ebf5c-b152-4de7-88bc-4536f7bf3206" containerName="extract-utilities" Oct 03 08:56:59 crc kubenswrapper[4664]: E1003 08:56:59.894092 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4ebf5c-b152-4de7-88bc-4536f7bf3206" containerName="extract-content" Oct 03 08:56:59 crc kubenswrapper[4664]: I1003 08:56:59.894098 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4ebf5c-b152-4de7-88bc-4536f7bf3206" containerName="extract-content" Oct 03 08:56:59 crc kubenswrapper[4664]: I1003 08:56:59.894304 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba4ebf5c-b152-4de7-88bc-4536f7bf3206" containerName="registry-server" Oct 03 08:56:59 crc kubenswrapper[4664]: I1003 08:56:59.899418 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:56:59 crc kubenswrapper[4664]: I1003 08:56:59.904558 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rfjf8"] Oct 03 08:57:00 crc kubenswrapper[4664]: I1003 08:57:00.054625 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-utilities\") pod \"redhat-operators-rfjf8\" (UID: \"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4\") " pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:00 crc kubenswrapper[4664]: I1003 08:57:00.055090 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zrdc\" (UniqueName: \"kubernetes.io/projected/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-kube-api-access-7zrdc\") pod \"redhat-operators-rfjf8\" (UID: \"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4\") " pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:00 crc kubenswrapper[4664]: I1003 08:57:00.055356 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-catalog-content\") pod \"redhat-operators-rfjf8\" (UID: \"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4\") " pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:00 crc kubenswrapper[4664]: I1003 08:57:00.157628 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-utilities\") pod \"redhat-operators-rfjf8\" (UID: \"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4\") " pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:00 crc kubenswrapper[4664]: I1003 08:57:00.157815 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zrdc\" (UniqueName: \"kubernetes.io/projected/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-kube-api-access-7zrdc\") pod \"redhat-operators-rfjf8\" (UID: \"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4\") " pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:00 crc kubenswrapper[4664]: I1003 08:57:00.157908 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-catalog-content\") pod \"redhat-operators-rfjf8\" (UID: \"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4\") " pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:00 crc kubenswrapper[4664]: I1003 08:57:00.158157 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-utilities\") pod \"redhat-operators-rfjf8\" (UID: \"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4\") " pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:00 crc kubenswrapper[4664]: I1003 08:57:00.158283 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-catalog-content\") pod \"redhat-operators-rfjf8\" (UID: \"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4\") " pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:00 crc kubenswrapper[4664]: I1003 08:57:00.179222 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zrdc\" (UniqueName: \"kubernetes.io/projected/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-kube-api-access-7zrdc\") pod \"redhat-operators-rfjf8\" (UID: \"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4\") " pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:00 crc kubenswrapper[4664]: I1003 08:57:00.237854 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:00 crc kubenswrapper[4664]: I1003 08:57:00.751035 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rfjf8"] Oct 03 08:57:00 crc kubenswrapper[4664]: W1003 08:57:00.764383 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dd2e0f0_7fe3_460d_8fdc_269a7796dce4.slice/crio-e0a929367e068d233ce36a6f89ccda316e25bad92bf74c3f78cab62cded61316 WatchSource:0}: Error finding container e0a929367e068d233ce36a6f89ccda316e25bad92bf74c3f78cab62cded61316: Status 404 returned error can't find the container with id e0a929367e068d233ce36a6f89ccda316e25bad92bf74c3f78cab62cded61316 Oct 03 08:57:00 crc kubenswrapper[4664]: I1003 08:57:00.806156 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfjf8" event={"ID":"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4","Type":"ContainerStarted","Data":"e0a929367e068d233ce36a6f89ccda316e25bad92bf74c3f78cab62cded61316"} Oct 03 08:57:01 crc kubenswrapper[4664]: I1003 08:57:01.818395 4664 generic.go:334] "Generic (PLEG): container finished" podID="8dd2e0f0-7fe3-460d-8fdc-269a7796dce4" containerID="cc97880d0cf2ae57a07be6b9151174bad93805a36da36bde16ca5c2419a89a14" exitCode=0 Oct 03 08:57:01 crc kubenswrapper[4664]: I1003 08:57:01.818472 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfjf8" event={"ID":"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4","Type":"ContainerDied","Data":"cc97880d0cf2ae57a07be6b9151174bad93805a36da36bde16ca5c2419a89a14"} Oct 03 08:57:01 crc kubenswrapper[4664]: I1003 08:57:01.877177 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:57:01 crc kubenswrapper[4664]: E1003 08:57:01.877810 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:57:02 crc kubenswrapper[4664]: I1003 08:57:02.272554 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k7zvg"] Oct 03 08:57:02 crc kubenswrapper[4664]: I1003 08:57:02.275649 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:02 crc kubenswrapper[4664]: I1003 08:57:02.285046 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k7zvg"] Oct 03 08:57:02 crc kubenswrapper[4664]: I1003 08:57:02.407655 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7b89758-8086-41ef-8ed6-eaa91f931ebc-utilities\") pod \"community-operators-k7zvg\" (UID: \"c7b89758-8086-41ef-8ed6-eaa91f931ebc\") " pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:02 crc kubenswrapper[4664]: I1003 08:57:02.407882 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7b89758-8086-41ef-8ed6-eaa91f931ebc-catalog-content\") pod \"community-operators-k7zvg\" (UID: \"c7b89758-8086-41ef-8ed6-eaa91f931ebc\") " pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:02 crc kubenswrapper[4664]: I1003 08:57:02.407932 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lccfg\" (UniqueName: \"kubernetes.io/projected/c7b89758-8086-41ef-8ed6-eaa91f931ebc-kube-api-access-lccfg\") pod \"community-operators-k7zvg\" (UID: \"c7b89758-8086-41ef-8ed6-eaa91f931ebc\") " pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:02 crc kubenswrapper[4664]: I1003 08:57:02.510062 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7b89758-8086-41ef-8ed6-eaa91f931ebc-catalog-content\") pod \"community-operators-k7zvg\" (UID: \"c7b89758-8086-41ef-8ed6-eaa91f931ebc\") " pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:02 crc kubenswrapper[4664]: I1003 08:57:02.510195 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lccfg\" (UniqueName: \"kubernetes.io/projected/c7b89758-8086-41ef-8ed6-eaa91f931ebc-kube-api-access-lccfg\") pod \"community-operators-k7zvg\" (UID: \"c7b89758-8086-41ef-8ed6-eaa91f931ebc\") " pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:02 crc kubenswrapper[4664]: I1003 08:57:02.510319 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7b89758-8086-41ef-8ed6-eaa91f931ebc-utilities\") pod \"community-operators-k7zvg\" (UID: \"c7b89758-8086-41ef-8ed6-eaa91f931ebc\") " pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:02 crc kubenswrapper[4664]: I1003 08:57:02.510873 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7b89758-8086-41ef-8ed6-eaa91f931ebc-catalog-content\") pod \"community-operators-k7zvg\" (UID: 
\"c7b89758-8086-41ef-8ed6-eaa91f931ebc\") " pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:02 crc kubenswrapper[4664]: I1003 08:57:02.510996 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7b89758-8086-41ef-8ed6-eaa91f931ebc-utilities\") pod \"community-operators-k7zvg\" (UID: \"c7b89758-8086-41ef-8ed6-eaa91f931ebc\") " pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:02 crc kubenswrapper[4664]: I1003 08:57:02.535762 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lccfg\" (UniqueName: \"kubernetes.io/projected/c7b89758-8086-41ef-8ed6-eaa91f931ebc-kube-api-access-lccfg\") pod \"community-operators-k7zvg\" (UID: \"c7b89758-8086-41ef-8ed6-eaa91f931ebc\") " pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:02 crc kubenswrapper[4664]: I1003 08:57:02.609336 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:03 crc kubenswrapper[4664]: I1003 08:57:03.134306 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k7zvg"] Oct 03 08:57:03 crc kubenswrapper[4664]: I1003 08:57:03.838788 4664 generic.go:334] "Generic (PLEG): container finished" podID="8dd2e0f0-7fe3-460d-8fdc-269a7796dce4" containerID="34844c10cd7260aa9c69de6facc1adc8bb69c4c00fd6f9abd35638268be53955" exitCode=0 Oct 03 08:57:03 crc kubenswrapper[4664]: I1003 08:57:03.838879 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfjf8" event={"ID":"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4","Type":"ContainerDied","Data":"34844c10cd7260aa9c69de6facc1adc8bb69c4c00fd6f9abd35638268be53955"} Oct 03 08:57:03 crc kubenswrapper[4664]: I1003 08:57:03.842424 4664 generic.go:334] "Generic (PLEG): container finished" podID="c7b89758-8086-41ef-8ed6-eaa91f931ebc" containerID="290527915f0831c71ee7529dec6cb01b6a401ee7ec588e48b13a6e4059baeb53" exitCode=0 Oct 03 08:57:03 crc kubenswrapper[4664]: I1003 08:57:03.842465 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7zvg" event={"ID":"c7b89758-8086-41ef-8ed6-eaa91f931ebc","Type":"ContainerDied","Data":"290527915f0831c71ee7529dec6cb01b6a401ee7ec588e48b13a6e4059baeb53"} Oct 03 08:57:03 crc kubenswrapper[4664]: I1003 08:57:03.842511 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7zvg" event={"ID":"c7b89758-8086-41ef-8ed6-eaa91f931ebc","Type":"ContainerStarted","Data":"2a72764d043ad72ab0141ed8a8731ca8e04248ce0ddddf3f1157064ba8a3a0ea"} Oct 03 08:57:04 crc kubenswrapper[4664]: I1003 08:57:04.856574 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7zvg" event={"ID":"c7b89758-8086-41ef-8ed6-eaa91f931ebc","Type":"ContainerStarted","Data":"21976ee721ae44bce1e97254ad387c71efe4dd3680940c2bce4d735ef7d98c62"} Oct 03 08:57:04 crc kubenswrapper[4664]: I1003 08:57:04.861039 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfjf8" event={"ID":"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4","Type":"ContainerStarted","Data":"5b0368da7c58ee791bc3a605479598a6388fd734c4d8d4324abaf784ad342f13"} Oct 03 08:57:04 crc kubenswrapper[4664]: I1003 08:57:04.889325 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rfjf8" 
podStartSLOduration=3.481637191 podStartE2EDuration="5.889295901s" podCreationTimestamp="2025-10-03 08:56:59 +0000 UTC" firstStartedPulling="2025-10-03 08:57:01.821313107 +0000 UTC m=+4122.642503597" lastFinishedPulling="2025-10-03 08:57:04.228971817 +0000 UTC m=+4125.050162307" observedRunningTime="2025-10-03 08:57:04.880667474 +0000 UTC m=+4125.701857984" watchObservedRunningTime="2025-10-03 08:57:04.889295901 +0000 UTC m=+4125.710486391" Oct 03 08:57:05 crc kubenswrapper[4664]: I1003 08:57:05.872523 4664 generic.go:334] "Generic (PLEG): container finished" podID="c7b89758-8086-41ef-8ed6-eaa91f931ebc" containerID="21976ee721ae44bce1e97254ad387c71efe4dd3680940c2bce4d735ef7d98c62" exitCode=0 Oct 03 08:57:05 crc kubenswrapper[4664]: I1003 08:57:05.872713 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7zvg" event={"ID":"c7b89758-8086-41ef-8ed6-eaa91f931ebc","Type":"ContainerDied","Data":"21976ee721ae44bce1e97254ad387c71efe4dd3680940c2bce4d735ef7d98c62"} Oct 03 08:57:07 crc kubenswrapper[4664]: I1003 08:57:07.927547 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7zvg" event={"ID":"c7b89758-8086-41ef-8ed6-eaa91f931ebc","Type":"ContainerStarted","Data":"08dd3f8c019fffd356158e74957e7752cb34c547e205efc4566fea7085bbe067"} Oct 03 08:57:08 crc kubenswrapper[4664]: I1003 08:57:08.962674 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k7zvg" podStartSLOduration=4.450190737 podStartE2EDuration="6.962651164s" podCreationTimestamp="2025-10-03 08:57:02 +0000 UTC" firstStartedPulling="2025-10-03 08:57:03.843854018 +0000 UTC m=+4124.665044508" lastFinishedPulling="2025-10-03 08:57:06.356314435 +0000 UTC m=+4127.177504935" observedRunningTime="2025-10-03 08:57:08.955734666 +0000 UTC m=+4129.776925176" watchObservedRunningTime="2025-10-03 08:57:08.962651164 +0000 UTC m=+4129.783841654" Oct 03 08:57:10 crc kubenswrapper[4664]: I1003 08:57:10.238635 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:10 crc kubenswrapper[4664]: I1003 08:57:10.239172 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:10 crc kubenswrapper[4664]: I1003 08:57:10.297179 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:11 crc kubenswrapper[4664]: I1003 08:57:11.568118 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:11 crc kubenswrapper[4664]: I1003 08:57:11.854584 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rfjf8"] Oct 03 08:57:12 crc kubenswrapper[4664]: I1003 08:57:12.609802 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:12 crc kubenswrapper[4664]: I1003 08:57:12.610348 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:12 crc kubenswrapper[4664]: I1003 08:57:12.658835 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:12 crc kubenswrapper[4664]: I1003 08:57:12.878003 4664 scope.go:117] 
"RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:57:12 crc kubenswrapper[4664]: E1003 08:57:12.878676 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:57:12 crc kubenswrapper[4664]: I1003 08:57:12.978207 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rfjf8" podUID="8dd2e0f0-7fe3-460d-8fdc-269a7796dce4" containerName="registry-server" containerID="cri-o://5b0368da7c58ee791bc3a605479598a6388fd734c4d8d4324abaf784ad342f13" gracePeriod=2 Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.027066 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.460495 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.561152 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-utilities\") pod \"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4\" (UID: \"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4\") " Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.561389 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zrdc\" (UniqueName: \"kubernetes.io/projected/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-kube-api-access-7zrdc\") pod \"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4\" (UID: \"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4\") " Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.561596 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-catalog-content\") pod \"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4\" (UID: \"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4\") " Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.562224 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-utilities" (OuterVolumeSpecName: "utilities") pod "8dd2e0f0-7fe3-460d-8fdc-269a7796dce4" (UID: "8dd2e0f0-7fe3-460d-8fdc-269a7796dce4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.562628 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.568828 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-kube-api-access-7zrdc" (OuterVolumeSpecName: "kube-api-access-7zrdc") pod "8dd2e0f0-7fe3-460d-8fdc-269a7796dce4" (UID: "8dd2e0f0-7fe3-460d-8fdc-269a7796dce4"). InnerVolumeSpecName "kube-api-access-7zrdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.664544 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zrdc\" (UniqueName: \"kubernetes.io/projected/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-kube-api-access-7zrdc\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.727859 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8dd2e0f0-7fe3-460d-8fdc-269a7796dce4" (UID: "8dd2e0f0-7fe3-460d-8fdc-269a7796dce4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.767115 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.990791 4664 generic.go:334] "Generic (PLEG): container finished" podID="8dd2e0f0-7fe3-460d-8fdc-269a7796dce4" containerID="5b0368da7c58ee791bc3a605479598a6388fd734c4d8d4324abaf784ad342f13" exitCode=0 Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.991104 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rfjf8" Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.991126 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfjf8" event={"ID":"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4","Type":"ContainerDied","Data":"5b0368da7c58ee791bc3a605479598a6388fd734c4d8d4324abaf784ad342f13"} Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.991177 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfjf8" event={"ID":"8dd2e0f0-7fe3-460d-8fdc-269a7796dce4","Type":"ContainerDied","Data":"e0a929367e068d233ce36a6f89ccda316e25bad92bf74c3f78cab62cded61316"} Oct 03 08:57:13 crc kubenswrapper[4664]: I1003 08:57:13.991207 4664 scope.go:117] "RemoveContainer" containerID="5b0368da7c58ee791bc3a605479598a6388fd734c4d8d4324abaf784ad342f13" Oct 03 08:57:14 crc kubenswrapper[4664]: I1003 08:57:14.018798 4664 scope.go:117] "RemoveContainer" containerID="34844c10cd7260aa9c69de6facc1adc8bb69c4c00fd6f9abd35638268be53955" Oct 03 08:57:14 crc kubenswrapper[4664]: I1003 08:57:14.024865 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rfjf8"] Oct 03 08:57:14 crc kubenswrapper[4664]: I1003 08:57:14.035163 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rfjf8"] Oct 03 08:57:14 crc kubenswrapper[4664]: I1003 08:57:14.040800 4664 scope.go:117] "RemoveContainer" containerID="cc97880d0cf2ae57a07be6b9151174bad93805a36da36bde16ca5c2419a89a14" Oct 03 08:57:14 crc kubenswrapper[4664]: I1003 08:57:14.093616 4664 scope.go:117] "RemoveContainer" containerID="5b0368da7c58ee791bc3a605479598a6388fd734c4d8d4324abaf784ad342f13" Oct 03 08:57:14 crc kubenswrapper[4664]: E1003 08:57:14.094135 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b0368da7c58ee791bc3a605479598a6388fd734c4d8d4324abaf784ad342f13\": container with ID starting with 5b0368da7c58ee791bc3a605479598a6388fd734c4d8d4324abaf784ad342f13 
not found: ID does not exist" containerID="5b0368da7c58ee791bc3a605479598a6388fd734c4d8d4324abaf784ad342f13" Oct 03 08:57:14 crc kubenswrapper[4664]: I1003 08:57:14.094176 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b0368da7c58ee791bc3a605479598a6388fd734c4d8d4324abaf784ad342f13"} err="failed to get container status \"5b0368da7c58ee791bc3a605479598a6388fd734c4d8d4324abaf784ad342f13\": rpc error: code = NotFound desc = could not find container \"5b0368da7c58ee791bc3a605479598a6388fd734c4d8d4324abaf784ad342f13\": container with ID starting with 5b0368da7c58ee791bc3a605479598a6388fd734c4d8d4324abaf784ad342f13 not found: ID does not exist" Oct 03 08:57:14 crc kubenswrapper[4664]: I1003 08:57:14.094201 4664 scope.go:117] "RemoveContainer" containerID="34844c10cd7260aa9c69de6facc1adc8bb69c4c00fd6f9abd35638268be53955" Oct 03 08:57:14 crc kubenswrapper[4664]: E1003 08:57:14.094571 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34844c10cd7260aa9c69de6facc1adc8bb69c4c00fd6f9abd35638268be53955\": container with ID starting with 34844c10cd7260aa9c69de6facc1adc8bb69c4c00fd6f9abd35638268be53955 not found: ID does not exist" containerID="34844c10cd7260aa9c69de6facc1adc8bb69c4c00fd6f9abd35638268be53955" Oct 03 08:57:14 crc kubenswrapper[4664]: I1003 08:57:14.094646 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34844c10cd7260aa9c69de6facc1adc8bb69c4c00fd6f9abd35638268be53955"} err="failed to get container status \"34844c10cd7260aa9c69de6facc1adc8bb69c4c00fd6f9abd35638268be53955\": rpc error: code = NotFound desc = could not find container \"34844c10cd7260aa9c69de6facc1adc8bb69c4c00fd6f9abd35638268be53955\": container with ID starting with 34844c10cd7260aa9c69de6facc1adc8bb69c4c00fd6f9abd35638268be53955 not found: ID does not exist" Oct 03 08:57:14 crc kubenswrapper[4664]: I1003 08:57:14.094687 4664 scope.go:117] "RemoveContainer" containerID="cc97880d0cf2ae57a07be6b9151174bad93805a36da36bde16ca5c2419a89a14" Oct 03 08:57:14 crc kubenswrapper[4664]: E1003 08:57:14.095249 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc97880d0cf2ae57a07be6b9151174bad93805a36da36bde16ca5c2419a89a14\": container with ID starting with cc97880d0cf2ae57a07be6b9151174bad93805a36da36bde16ca5c2419a89a14 not found: ID does not exist" containerID="cc97880d0cf2ae57a07be6b9151174bad93805a36da36bde16ca5c2419a89a14" Oct 03 08:57:14 crc kubenswrapper[4664]: I1003 08:57:14.095291 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc97880d0cf2ae57a07be6b9151174bad93805a36da36bde16ca5c2419a89a14"} err="failed to get container status \"cc97880d0cf2ae57a07be6b9151174bad93805a36da36bde16ca5c2419a89a14\": rpc error: code = NotFound desc = could not find container \"cc97880d0cf2ae57a07be6b9151174bad93805a36da36bde16ca5c2419a89a14\": container with ID starting with cc97880d0cf2ae57a07be6b9151174bad93805a36da36bde16ca5c2419a89a14 not found: ID does not exist" Oct 03 08:57:14 crc kubenswrapper[4664]: I1003 08:57:14.884578 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k7zvg"] Oct 03 08:57:15 crc kubenswrapper[4664]: I1003 08:57:15.004584 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k7zvg" 
podUID="c7b89758-8086-41ef-8ed6-eaa91f931ebc" containerName="registry-server" containerID="cri-o://08dd3f8c019fffd356158e74957e7752cb34c547e205efc4566fea7085bbe067" gracePeriod=2 Oct 03 08:57:15 crc kubenswrapper[4664]: I1003 08:57:15.606927 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:15 crc kubenswrapper[4664]: I1003 08:57:15.711697 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lccfg\" (UniqueName: \"kubernetes.io/projected/c7b89758-8086-41ef-8ed6-eaa91f931ebc-kube-api-access-lccfg\") pod \"c7b89758-8086-41ef-8ed6-eaa91f931ebc\" (UID: \"c7b89758-8086-41ef-8ed6-eaa91f931ebc\") " Oct 03 08:57:15 crc kubenswrapper[4664]: I1003 08:57:15.711909 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7b89758-8086-41ef-8ed6-eaa91f931ebc-catalog-content\") pod \"c7b89758-8086-41ef-8ed6-eaa91f931ebc\" (UID: \"c7b89758-8086-41ef-8ed6-eaa91f931ebc\") " Oct 03 08:57:15 crc kubenswrapper[4664]: I1003 08:57:15.712074 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7b89758-8086-41ef-8ed6-eaa91f931ebc-utilities\") pod \"c7b89758-8086-41ef-8ed6-eaa91f931ebc\" (UID: \"c7b89758-8086-41ef-8ed6-eaa91f931ebc\") " Oct 03 08:57:15 crc kubenswrapper[4664]: I1003 08:57:15.713185 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7b89758-8086-41ef-8ed6-eaa91f931ebc-utilities" (OuterVolumeSpecName: "utilities") pod "c7b89758-8086-41ef-8ed6-eaa91f931ebc" (UID: "c7b89758-8086-41ef-8ed6-eaa91f931ebc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:57:15 crc kubenswrapper[4664]: I1003 08:57:15.719800 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b89758-8086-41ef-8ed6-eaa91f931ebc-kube-api-access-lccfg" (OuterVolumeSpecName: "kube-api-access-lccfg") pod "c7b89758-8086-41ef-8ed6-eaa91f931ebc" (UID: "c7b89758-8086-41ef-8ed6-eaa91f931ebc"). InnerVolumeSpecName "kube-api-access-lccfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:15 crc kubenswrapper[4664]: I1003 08:57:15.770568 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7b89758-8086-41ef-8ed6-eaa91f931ebc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7b89758-8086-41ef-8ed6-eaa91f931ebc" (UID: "c7b89758-8086-41ef-8ed6-eaa91f931ebc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:57:15 crc kubenswrapper[4664]: I1003 08:57:15.815309 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lccfg\" (UniqueName: \"kubernetes.io/projected/c7b89758-8086-41ef-8ed6-eaa91f931ebc-kube-api-access-lccfg\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:15 crc kubenswrapper[4664]: I1003 08:57:15.815356 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7b89758-8086-41ef-8ed6-eaa91f931ebc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:15 crc kubenswrapper[4664]: I1003 08:57:15.815370 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7b89758-8086-41ef-8ed6-eaa91f931ebc-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:15 crc kubenswrapper[4664]: I1003 08:57:15.888706 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dd2e0f0-7fe3-460d-8fdc-269a7796dce4" path="/var/lib/kubelet/pods/8dd2e0f0-7fe3-460d-8fdc-269a7796dce4/volumes" Oct 03 08:57:16 crc kubenswrapper[4664]: I1003 08:57:16.017053 4664 generic.go:334] "Generic (PLEG): container finished" podID="c7b89758-8086-41ef-8ed6-eaa91f931ebc" containerID="08dd3f8c019fffd356158e74957e7752cb34c547e205efc4566fea7085bbe067" exitCode=0 Oct 03 08:57:16 crc kubenswrapper[4664]: I1003 08:57:16.017188 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k7zvg" Oct 03 08:57:16 crc kubenswrapper[4664]: I1003 08:57:16.017420 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7zvg" event={"ID":"c7b89758-8086-41ef-8ed6-eaa91f931ebc","Type":"ContainerDied","Data":"08dd3f8c019fffd356158e74957e7752cb34c547e205efc4566fea7085bbe067"} Oct 03 08:57:16 crc kubenswrapper[4664]: I1003 08:57:16.017528 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7zvg" event={"ID":"c7b89758-8086-41ef-8ed6-eaa91f931ebc","Type":"ContainerDied","Data":"2a72764d043ad72ab0141ed8a8731ca8e04248ce0ddddf3f1157064ba8a3a0ea"} Oct 03 08:57:16 crc kubenswrapper[4664]: I1003 08:57:16.017633 4664 scope.go:117] "RemoveContainer" containerID="08dd3f8c019fffd356158e74957e7752cb34c547e205efc4566fea7085bbe067" Oct 03 08:57:16 crc kubenswrapper[4664]: I1003 08:57:16.086928 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k7zvg"] Oct 03 08:57:16 crc kubenswrapper[4664]: I1003 08:57:16.088967 4664 scope.go:117] "RemoveContainer" containerID="21976ee721ae44bce1e97254ad387c71efe4dd3680940c2bce4d735ef7d98c62" Oct 03 08:57:16 crc kubenswrapper[4664]: I1003 08:57:16.100508 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k7zvg"] Oct 03 08:57:16 crc kubenswrapper[4664]: I1003 08:57:16.114458 4664 scope.go:117] "RemoveContainer" containerID="290527915f0831c71ee7529dec6cb01b6a401ee7ec588e48b13a6e4059baeb53" Oct 03 08:57:16 crc kubenswrapper[4664]: I1003 08:57:16.157081 4664 scope.go:117] "RemoveContainer" containerID="08dd3f8c019fffd356158e74957e7752cb34c547e205efc4566fea7085bbe067" Oct 03 08:57:16 crc kubenswrapper[4664]: E1003 08:57:16.157633 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08dd3f8c019fffd356158e74957e7752cb34c547e205efc4566fea7085bbe067\": container with ID 
starting with 08dd3f8c019fffd356158e74957e7752cb34c547e205efc4566fea7085bbe067 not found: ID does not exist" containerID="08dd3f8c019fffd356158e74957e7752cb34c547e205efc4566fea7085bbe067" Oct 03 08:57:16 crc kubenswrapper[4664]: I1003 08:57:16.157730 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08dd3f8c019fffd356158e74957e7752cb34c547e205efc4566fea7085bbe067"} err="failed to get container status \"08dd3f8c019fffd356158e74957e7752cb34c547e205efc4566fea7085bbe067\": rpc error: code = NotFound desc = could not find container \"08dd3f8c019fffd356158e74957e7752cb34c547e205efc4566fea7085bbe067\": container with ID starting with 08dd3f8c019fffd356158e74957e7752cb34c547e205efc4566fea7085bbe067 not found: ID does not exist" Oct 03 08:57:16 crc kubenswrapper[4664]: I1003 08:57:16.157762 4664 scope.go:117] "RemoveContainer" containerID="21976ee721ae44bce1e97254ad387c71efe4dd3680940c2bce4d735ef7d98c62" Oct 03 08:57:16 crc kubenswrapper[4664]: E1003 08:57:16.158200 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21976ee721ae44bce1e97254ad387c71efe4dd3680940c2bce4d735ef7d98c62\": container with ID starting with 21976ee721ae44bce1e97254ad387c71efe4dd3680940c2bce4d735ef7d98c62 not found: ID does not exist" containerID="21976ee721ae44bce1e97254ad387c71efe4dd3680940c2bce4d735ef7d98c62" Oct 03 08:57:16 crc kubenswrapper[4664]: I1003 08:57:16.158329 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21976ee721ae44bce1e97254ad387c71efe4dd3680940c2bce4d735ef7d98c62"} err="failed to get container status \"21976ee721ae44bce1e97254ad387c71efe4dd3680940c2bce4d735ef7d98c62\": rpc error: code = NotFound desc = could not find container \"21976ee721ae44bce1e97254ad387c71efe4dd3680940c2bce4d735ef7d98c62\": container with ID starting with 21976ee721ae44bce1e97254ad387c71efe4dd3680940c2bce4d735ef7d98c62 not found: ID does not exist" Oct 03 08:57:16 crc kubenswrapper[4664]: I1003 08:57:16.158387 4664 scope.go:117] "RemoveContainer" containerID="290527915f0831c71ee7529dec6cb01b6a401ee7ec588e48b13a6e4059baeb53" Oct 03 08:57:16 crc kubenswrapper[4664]: E1003 08:57:16.158886 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"290527915f0831c71ee7529dec6cb01b6a401ee7ec588e48b13a6e4059baeb53\": container with ID starting with 290527915f0831c71ee7529dec6cb01b6a401ee7ec588e48b13a6e4059baeb53 not found: ID does not exist" containerID="290527915f0831c71ee7529dec6cb01b6a401ee7ec588e48b13a6e4059baeb53" Oct 03 08:57:16 crc kubenswrapper[4664]: I1003 08:57:16.159074 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"290527915f0831c71ee7529dec6cb01b6a401ee7ec588e48b13a6e4059baeb53"} err="failed to get container status \"290527915f0831c71ee7529dec6cb01b6a401ee7ec588e48b13a6e4059baeb53\": rpc error: code = NotFound desc = could not find container \"290527915f0831c71ee7529dec6cb01b6a401ee7ec588e48b13a6e4059baeb53\": container with ID starting with 290527915f0831c71ee7529dec6cb01b6a401ee7ec588e48b13a6e4059baeb53 not found: ID does not exist" Oct 03 08:57:17 crc kubenswrapper[4664]: I1003 08:57:17.891554 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7b89758-8086-41ef-8ed6-eaa91f931ebc" path="/var/lib/kubelet/pods/c7b89758-8086-41ef-8ed6-eaa91f931ebc/volumes" Oct 03 08:57:23 crc kubenswrapper[4664]: I1003 
08:57:23.876480 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:57:23 crc kubenswrapper[4664]: E1003 08:57:23.877424 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:57:37 crc kubenswrapper[4664]: I1003 08:57:37.876961 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:57:37 crc kubenswrapper[4664]: E1003 08:57:37.877755 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 08:57:48 crc kubenswrapper[4664]: I1003 08:57:48.876325 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 08:57:49 crc kubenswrapper[4664]: I1003 08:57:49.354390 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"4721b0d96904390f5d1bae6f797eb14b686b3ebe327c373eedf7fe5fd3b02c6f"} Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.165066 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5"] Oct 03 09:00:00 crc kubenswrapper[4664]: E1003 09:00:00.166596 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd2e0f0-7fe3-460d-8fdc-269a7796dce4" containerName="extract-content" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.166616 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd2e0f0-7fe3-460d-8fdc-269a7796dce4" containerName="extract-content" Oct 03 09:00:00 crc kubenswrapper[4664]: E1003 09:00:00.166671 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b89758-8086-41ef-8ed6-eaa91f931ebc" containerName="registry-server" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.166680 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b89758-8086-41ef-8ed6-eaa91f931ebc" containerName="registry-server" Oct 03 09:00:00 crc kubenswrapper[4664]: E1003 09:00:00.166717 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b89758-8086-41ef-8ed6-eaa91f931ebc" containerName="extract-utilities" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.166725 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b89758-8086-41ef-8ed6-eaa91f931ebc" containerName="extract-utilities" Oct 03 09:00:00 crc kubenswrapper[4664]: E1003 09:00:00.166735 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b89758-8086-41ef-8ed6-eaa91f931ebc" containerName="extract-content" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.166742 4664 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c7b89758-8086-41ef-8ed6-eaa91f931ebc" containerName="extract-content" Oct 03 09:00:00 crc kubenswrapper[4664]: E1003 09:00:00.166753 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd2e0f0-7fe3-460d-8fdc-269a7796dce4" containerName="registry-server" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.166761 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd2e0f0-7fe3-460d-8fdc-269a7796dce4" containerName="registry-server" Oct 03 09:00:00 crc kubenswrapper[4664]: E1003 09:00:00.166781 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd2e0f0-7fe3-460d-8fdc-269a7796dce4" containerName="extract-utilities" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.166790 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd2e0f0-7fe3-460d-8fdc-269a7796dce4" containerName="extract-utilities" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.167024 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd2e0f0-7fe3-460d-8fdc-269a7796dce4" containerName="registry-server" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.167048 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b89758-8086-41ef-8ed6-eaa91f931ebc" containerName="registry-server" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.169322 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.173305 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.173576 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.184709 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5"] Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.196255 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp5gr\" (UniqueName: \"kubernetes.io/projected/eac48591-9e4d-4bde-afe9-c4c42f902388-kube-api-access-mp5gr\") pod \"collect-profiles-29324700-f26v5\" (UID: \"eac48591-9e4d-4bde-afe9-c4c42f902388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.196331 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eac48591-9e4d-4bde-afe9-c4c42f902388-config-volume\") pod \"collect-profiles-29324700-f26v5\" (UID: \"eac48591-9e4d-4bde-afe9-c4c42f902388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.196439 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eac48591-9e4d-4bde-afe9-c4c42f902388-secret-volume\") pod \"collect-profiles-29324700-f26v5\" (UID: \"eac48591-9e4d-4bde-afe9-c4c42f902388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.298201 4664 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-mp5gr\" (UniqueName: \"kubernetes.io/projected/eac48591-9e4d-4bde-afe9-c4c42f902388-kube-api-access-mp5gr\") pod \"collect-profiles-29324700-f26v5\" (UID: \"eac48591-9e4d-4bde-afe9-c4c42f902388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.298309 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eac48591-9e4d-4bde-afe9-c4c42f902388-config-volume\") pod \"collect-profiles-29324700-f26v5\" (UID: \"eac48591-9e4d-4bde-afe9-c4c42f902388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.298393 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eac48591-9e4d-4bde-afe9-c4c42f902388-secret-volume\") pod \"collect-profiles-29324700-f26v5\" (UID: \"eac48591-9e4d-4bde-afe9-c4c42f902388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.299549 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eac48591-9e4d-4bde-afe9-c4c42f902388-config-volume\") pod \"collect-profiles-29324700-f26v5\" (UID: \"eac48591-9e4d-4bde-afe9-c4c42f902388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.308097 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eac48591-9e4d-4bde-afe9-c4c42f902388-secret-volume\") pod \"collect-profiles-29324700-f26v5\" (UID: \"eac48591-9e4d-4bde-afe9-c4c42f902388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.320473 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp5gr\" (UniqueName: \"kubernetes.io/projected/eac48591-9e4d-4bde-afe9-c4c42f902388-kube-api-access-mp5gr\") pod \"collect-profiles-29324700-f26v5\" (UID: \"eac48591-9e4d-4bde-afe9-c4c42f902388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" Oct 03 09:00:00 crc kubenswrapper[4664]: I1003 09:00:00.502400 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" Oct 03 09:00:01 crc kubenswrapper[4664]: I1003 09:00:01.000540 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5"] Oct 03 09:00:01 crc kubenswrapper[4664]: W1003 09:00:01.008016 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeac48591_9e4d_4bde_afe9_c4c42f902388.slice/crio-b1e824345d46eb7d86a803ce48dbfe4a73cde07a22cae62e8e53e89fc5b6399a WatchSource:0}: Error finding container b1e824345d46eb7d86a803ce48dbfe4a73cde07a22cae62e8e53e89fc5b6399a: Status 404 returned error can't find the container with id b1e824345d46eb7d86a803ce48dbfe4a73cde07a22cae62e8e53e89fc5b6399a Oct 03 09:00:01 crc kubenswrapper[4664]: I1003 09:00:01.837516 4664 generic.go:334] "Generic (PLEG): container finished" podID="eac48591-9e4d-4bde-afe9-c4c42f902388" containerID="fb3e9882205f4fb4dc0337630438b05e551080fad0e3353b457d587d7fcf8f2c" exitCode=0 Oct 03 09:00:01 crc kubenswrapper[4664]: I1003 09:00:01.837574 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" event={"ID":"eac48591-9e4d-4bde-afe9-c4c42f902388","Type":"ContainerDied","Data":"fb3e9882205f4fb4dc0337630438b05e551080fad0e3353b457d587d7fcf8f2c"} Oct 03 09:00:01 crc kubenswrapper[4664]: I1003 09:00:01.838004 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" event={"ID":"eac48591-9e4d-4bde-afe9-c4c42f902388","Type":"ContainerStarted","Data":"b1e824345d46eb7d86a803ce48dbfe4a73cde07a22cae62e8e53e89fc5b6399a"} Oct 03 09:00:03 crc kubenswrapper[4664]: I1003 09:00:03.198518 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" Oct 03 09:00:03 crc kubenswrapper[4664]: I1003 09:00:03.383393 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp5gr\" (UniqueName: \"kubernetes.io/projected/eac48591-9e4d-4bde-afe9-c4c42f902388-kube-api-access-mp5gr\") pod \"eac48591-9e4d-4bde-afe9-c4c42f902388\" (UID: \"eac48591-9e4d-4bde-afe9-c4c42f902388\") " Oct 03 09:00:03 crc kubenswrapper[4664]: I1003 09:00:03.383500 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eac48591-9e4d-4bde-afe9-c4c42f902388-config-volume\") pod \"eac48591-9e4d-4bde-afe9-c4c42f902388\" (UID: \"eac48591-9e4d-4bde-afe9-c4c42f902388\") " Oct 03 09:00:03 crc kubenswrapper[4664]: I1003 09:00:03.383637 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eac48591-9e4d-4bde-afe9-c4c42f902388-secret-volume\") pod \"eac48591-9e4d-4bde-afe9-c4c42f902388\" (UID: \"eac48591-9e4d-4bde-afe9-c4c42f902388\") " Oct 03 09:00:03 crc kubenswrapper[4664]: I1003 09:00:03.384995 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eac48591-9e4d-4bde-afe9-c4c42f902388-config-volume" (OuterVolumeSpecName: "config-volume") pod "eac48591-9e4d-4bde-afe9-c4c42f902388" (UID: "eac48591-9e4d-4bde-afe9-c4c42f902388"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:00:03 crc kubenswrapper[4664]: I1003 09:00:03.392455 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eac48591-9e4d-4bde-afe9-c4c42f902388-kube-api-access-mp5gr" (OuterVolumeSpecName: "kube-api-access-mp5gr") pod "eac48591-9e4d-4bde-afe9-c4c42f902388" (UID: "eac48591-9e4d-4bde-afe9-c4c42f902388"). InnerVolumeSpecName "kube-api-access-mp5gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:03 crc kubenswrapper[4664]: I1003 09:00:03.393546 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac48591-9e4d-4bde-afe9-c4c42f902388-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eac48591-9e4d-4bde-afe9-c4c42f902388" (UID: "eac48591-9e4d-4bde-afe9-c4c42f902388"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:03 crc kubenswrapper[4664]: I1003 09:00:03.486841 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp5gr\" (UniqueName: \"kubernetes.io/projected/eac48591-9e4d-4bde-afe9-c4c42f902388-kube-api-access-mp5gr\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:03 crc kubenswrapper[4664]: I1003 09:00:03.486896 4664 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eac48591-9e4d-4bde-afe9-c4c42f902388-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:03 crc kubenswrapper[4664]: I1003 09:00:03.486907 4664 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eac48591-9e4d-4bde-afe9-c4c42f902388-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:03 crc kubenswrapper[4664]: I1003 09:00:03.868547 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" event={"ID":"eac48591-9e4d-4bde-afe9-c4c42f902388","Type":"ContainerDied","Data":"b1e824345d46eb7d86a803ce48dbfe4a73cde07a22cae62e8e53e89fc5b6399a"} Oct 03 09:00:03 crc kubenswrapper[4664]: I1003 09:00:03.868600 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1e824345d46eb7d86a803ce48dbfe4a73cde07a22cae62e8e53e89fc5b6399a" Oct 03 09:00:03 crc kubenswrapper[4664]: I1003 09:00:03.868681 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-f26v5" Oct 03 09:00:04 crc kubenswrapper[4664]: I1003 09:00:04.285275 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp"] Oct 03 09:00:04 crc kubenswrapper[4664]: I1003 09:00:04.294091 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324655-n56qp"] Oct 03 09:00:05 crc kubenswrapper[4664]: I1003 09:00:05.888891 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7061a942-1c4a-417d-a7ba-9082b5a64147" path="/var/lib/kubelet/pods/7061a942-1c4a-417d-a7ba-9082b5a64147/volumes" Oct 03 09:00:11 crc kubenswrapper[4664]: I1003 09:00:11.987426 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:00:11 crc kubenswrapper[4664]: I1003 09:00:11.988357 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:00:27 crc kubenswrapper[4664]: I1003 09:00:27.292895 4664 scope.go:117] "RemoveContainer" containerID="e21dc86ae7adf5be3298fc10c57f4f4752222547e3a2c5389335e8572afbbffb" Oct 03 09:00:41 crc kubenswrapper[4664]: I1003 09:00:41.987594 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:00:41 crc kubenswrapper[4664]: I1003 09:00:41.988221 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.156281 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29324701-s7s4z"] Oct 03 09:01:00 crc kubenswrapper[4664]: E1003 09:01:00.157657 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac48591-9e4d-4bde-afe9-c4c42f902388" containerName="collect-profiles" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.157676 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac48591-9e4d-4bde-afe9-c4c42f902388" containerName="collect-profiles" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.157939 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac48591-9e4d-4bde-afe9-c4c42f902388" containerName="collect-profiles" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.158852 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.169773 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29324701-s7s4z"] Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.204003 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-config-data\") pod \"keystone-cron-29324701-s7s4z\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.204161 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-fernet-keys\") pod \"keystone-cron-29324701-s7s4z\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.204230 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkkht\" (UniqueName: \"kubernetes.io/projected/58076b89-9a2c-42e4-83dc-f26ef09f5d55-kube-api-access-xkkht\") pod \"keystone-cron-29324701-s7s4z\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.204258 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-combined-ca-bundle\") pod \"keystone-cron-29324701-s7s4z\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.306768 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-config-data\") pod \"keystone-cron-29324701-s7s4z\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.306898 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-fernet-keys\") pod \"keystone-cron-29324701-s7s4z\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.306965 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkkht\" (UniqueName: \"kubernetes.io/projected/58076b89-9a2c-42e4-83dc-f26ef09f5d55-kube-api-access-xkkht\") pod \"keystone-cron-29324701-s7s4z\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.306994 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-combined-ca-bundle\") pod \"keystone-cron-29324701-s7s4z\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.316428 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-fernet-keys\") pod \"keystone-cron-29324701-s7s4z\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.316620 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-config-data\") pod \"keystone-cron-29324701-s7s4z\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.323392 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-combined-ca-bundle\") pod \"keystone-cron-29324701-s7s4z\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.325406 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkkht\" (UniqueName: \"kubernetes.io/projected/58076b89-9a2c-42e4-83dc-f26ef09f5d55-kube-api-access-xkkht\") pod \"keystone-cron-29324701-s7s4z\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:00 crc kubenswrapper[4664]: I1003 09:01:00.486653 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:01 crc kubenswrapper[4664]: I1003 09:01:00.998228 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29324701-s7s4z"] Oct 03 09:01:01 crc kubenswrapper[4664]: I1003 09:01:01.464350 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324701-s7s4z" event={"ID":"58076b89-9a2c-42e4-83dc-f26ef09f5d55","Type":"ContainerStarted","Data":"a38cfebd9f556abeef6f97a107429fa8d7a86fac65b2c1cc0db2eeb8db18f784"} Oct 03 09:01:01 crc kubenswrapper[4664]: I1003 09:01:01.464877 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324701-s7s4z" event={"ID":"58076b89-9a2c-42e4-83dc-f26ef09f5d55","Type":"ContainerStarted","Data":"a41fb2fb98ea2faac4ba4409908ba4696f74e8661f3f86dbdf616c2de818b502"} Oct 03 09:01:01 crc kubenswrapper[4664]: I1003 09:01:01.497725 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29324701-s7s4z" podStartSLOduration=1.497685294 podStartE2EDuration="1.497685294s" podCreationTimestamp="2025-10-03 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:01.487102761 +0000 UTC m=+4362.308293251" watchObservedRunningTime="2025-10-03 09:01:01.497685294 +0000 UTC m=+4362.318875794" Oct 03 09:01:04 crc kubenswrapper[4664]: I1003 09:01:04.505393 4664 generic.go:334] "Generic (PLEG): container finished" podID="58076b89-9a2c-42e4-83dc-f26ef09f5d55" containerID="a38cfebd9f556abeef6f97a107429fa8d7a86fac65b2c1cc0db2eeb8db18f784" exitCode=0 Oct 03 09:01:04 crc kubenswrapper[4664]: I1003 09:01:04.505482 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324701-s7s4z" event={"ID":"58076b89-9a2c-42e4-83dc-f26ef09f5d55","Type":"ContainerDied","Data":"a38cfebd9f556abeef6f97a107429fa8d7a86fac65b2c1cc0db2eeb8db18f784"} Oct 03 09:01:06 crc kubenswrapper[4664]: 
I1003 09:01:06.016399 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:06 crc kubenswrapper[4664]: I1003 09:01:06.140256 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-combined-ca-bundle\") pod \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " Oct 03 09:01:06 crc kubenswrapper[4664]: I1003 09:01:06.140324 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-config-data\") pod \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " Oct 03 09:01:06 crc kubenswrapper[4664]: I1003 09:01:06.140419 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkkht\" (UniqueName: \"kubernetes.io/projected/58076b89-9a2c-42e4-83dc-f26ef09f5d55-kube-api-access-xkkht\") pod \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " Oct 03 09:01:06 crc kubenswrapper[4664]: I1003 09:01:06.140585 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-fernet-keys\") pod \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\" (UID: \"58076b89-9a2c-42e4-83dc-f26ef09f5d55\") " Oct 03 09:01:06 crc kubenswrapper[4664]: I1003 09:01:06.154250 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58076b89-9a2c-42e4-83dc-f26ef09f5d55-kube-api-access-xkkht" (OuterVolumeSpecName: "kube-api-access-xkkht") pod "58076b89-9a2c-42e4-83dc-f26ef09f5d55" (UID: "58076b89-9a2c-42e4-83dc-f26ef09f5d55"). InnerVolumeSpecName "kube-api-access-xkkht". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:06 crc kubenswrapper[4664]: I1003 09:01:06.160640 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "58076b89-9a2c-42e4-83dc-f26ef09f5d55" (UID: "58076b89-9a2c-42e4-83dc-f26ef09f5d55"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:06 crc kubenswrapper[4664]: I1003 09:01:06.200737 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58076b89-9a2c-42e4-83dc-f26ef09f5d55" (UID: "58076b89-9a2c-42e4-83dc-f26ef09f5d55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:06 crc kubenswrapper[4664]: I1003 09:01:06.221706 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-config-data" (OuterVolumeSpecName: "config-data") pod "58076b89-9a2c-42e4-83dc-f26ef09f5d55" (UID: "58076b89-9a2c-42e4-83dc-f26ef09f5d55"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:06 crc kubenswrapper[4664]: I1003 09:01:06.244681 4664 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:06 crc kubenswrapper[4664]: I1003 09:01:06.244740 4664 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:06 crc kubenswrapper[4664]: I1003 09:01:06.244757 4664 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58076b89-9a2c-42e4-83dc-f26ef09f5d55-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:06 crc kubenswrapper[4664]: I1003 09:01:06.244781 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkkht\" (UniqueName: \"kubernetes.io/projected/58076b89-9a2c-42e4-83dc-f26ef09f5d55-kube-api-access-xkkht\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:06 crc kubenswrapper[4664]: I1003 09:01:06.525456 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324701-s7s4z" event={"ID":"58076b89-9a2c-42e4-83dc-f26ef09f5d55","Type":"ContainerDied","Data":"a41fb2fb98ea2faac4ba4409908ba4696f74e8661f3f86dbdf616c2de818b502"} Oct 03 09:01:06 crc kubenswrapper[4664]: I1003 09:01:06.525836 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a41fb2fb98ea2faac4ba4409908ba4696f74e8661f3f86dbdf616c2de818b502" Oct 03 09:01:06 crc kubenswrapper[4664]: I1003 09:01:06.525520 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29324701-s7s4z" Oct 03 09:01:11 crc kubenswrapper[4664]: I1003 09:01:11.987549 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:01:11 crc kubenswrapper[4664]: I1003 09:01:11.988505 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:01:11 crc kubenswrapper[4664]: I1003 09:01:11.988565 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 09:01:11 crc kubenswrapper[4664]: I1003 09:01:11.989758 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4721b0d96904390f5d1bae6f797eb14b686b3ebe327c373eedf7fe5fd3b02c6f"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:01:11 crc kubenswrapper[4664]: I1003 09:01:11.989836 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" 
containerID="cri-o://4721b0d96904390f5d1bae6f797eb14b686b3ebe327c373eedf7fe5fd3b02c6f" gracePeriod=600 Oct 03 09:01:12 crc kubenswrapper[4664]: I1003 09:01:12.595866 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="4721b0d96904390f5d1bae6f797eb14b686b3ebe327c373eedf7fe5fd3b02c6f" exitCode=0 Oct 03 09:01:12 crc kubenswrapper[4664]: I1003 09:01:12.596301 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"4721b0d96904390f5d1bae6f797eb14b686b3ebe327c373eedf7fe5fd3b02c6f"} Oct 03 09:01:12 crc kubenswrapper[4664]: I1003 09:01:12.596349 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da"} Oct 03 09:01:12 crc kubenswrapper[4664]: I1003 09:01:12.596371 4664 scope.go:117] "RemoveContainer" containerID="c2578004127ac42c34a7a434d2b185c8e2f477653435943ff7ad1aca115c81d8" Oct 03 09:03:41 crc kubenswrapper[4664]: I1003 09:03:41.987658 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:03:41 crc kubenswrapper[4664]: I1003 09:03:41.988264 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:03:48 crc kubenswrapper[4664]: I1003 09:03:48.324341 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6td26"] Oct 03 09:03:48 crc kubenswrapper[4664]: E1003 09:03:48.325523 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58076b89-9a2c-42e4-83dc-f26ef09f5d55" containerName="keystone-cron" Oct 03 09:03:48 crc kubenswrapper[4664]: I1003 09:03:48.325538 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="58076b89-9a2c-42e4-83dc-f26ef09f5d55" containerName="keystone-cron" Oct 03 09:03:48 crc kubenswrapper[4664]: I1003 09:03:48.325798 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="58076b89-9a2c-42e4-83dc-f26ef09f5d55" containerName="keystone-cron" Oct 03 09:03:48 crc kubenswrapper[4664]: I1003 09:03:48.327590 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:03:48 crc kubenswrapper[4664]: I1003 09:03:48.392534 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rxw5\" (UniqueName: \"kubernetes.io/projected/afc7650b-353f-4385-a823-2106df87a3f4-kube-api-access-4rxw5\") pod \"certified-operators-6td26\" (UID: \"afc7650b-353f-4385-a823-2106df87a3f4\") " pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:03:48 crc kubenswrapper[4664]: I1003 09:03:48.392901 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afc7650b-353f-4385-a823-2106df87a3f4-catalog-content\") pod \"certified-operators-6td26\" (UID: \"afc7650b-353f-4385-a823-2106df87a3f4\") " pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:03:48 crc kubenswrapper[4664]: I1003 09:03:48.393259 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afc7650b-353f-4385-a823-2106df87a3f4-utilities\") pod \"certified-operators-6td26\" (UID: \"afc7650b-353f-4385-a823-2106df87a3f4\") " pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:03:48 crc kubenswrapper[4664]: I1003 09:03:48.410061 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6td26"] Oct 03 09:03:48 crc kubenswrapper[4664]: I1003 09:03:48.494586 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rxw5\" (UniqueName: \"kubernetes.io/projected/afc7650b-353f-4385-a823-2106df87a3f4-kube-api-access-4rxw5\") pod \"certified-operators-6td26\" (UID: \"afc7650b-353f-4385-a823-2106df87a3f4\") " pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:03:48 crc kubenswrapper[4664]: I1003 09:03:48.494741 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afc7650b-353f-4385-a823-2106df87a3f4-catalog-content\") pod \"certified-operators-6td26\" (UID: \"afc7650b-353f-4385-a823-2106df87a3f4\") " pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:03:48 crc kubenswrapper[4664]: I1003 09:03:48.494826 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afc7650b-353f-4385-a823-2106df87a3f4-utilities\") pod \"certified-operators-6td26\" (UID: \"afc7650b-353f-4385-a823-2106df87a3f4\") " pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:03:48 crc kubenswrapper[4664]: I1003 09:03:48.495632 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afc7650b-353f-4385-a823-2106df87a3f4-utilities\") pod \"certified-operators-6td26\" (UID: \"afc7650b-353f-4385-a823-2106df87a3f4\") " pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:03:48 crc kubenswrapper[4664]: I1003 09:03:48.495993 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afc7650b-353f-4385-a823-2106df87a3f4-catalog-content\") pod \"certified-operators-6td26\" (UID: \"afc7650b-353f-4385-a823-2106df87a3f4\") " pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:03:48 crc kubenswrapper[4664]: I1003 09:03:48.722635 4664 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4rxw5\" (UniqueName: \"kubernetes.io/projected/afc7650b-353f-4385-a823-2106df87a3f4-kube-api-access-4rxw5\") pod \"certified-operators-6td26\" (UID: \"afc7650b-353f-4385-a823-2106df87a3f4\") " pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:03:49 crc kubenswrapper[4664]: I1003 09:03:49.008718 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:03:49 crc kubenswrapper[4664]: I1003 09:03:49.473549 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6td26"] Oct 03 09:03:50 crc kubenswrapper[4664]: I1003 09:03:50.218290 4664 generic.go:334] "Generic (PLEG): container finished" podID="afc7650b-353f-4385-a823-2106df87a3f4" containerID="f3d3830f83842d00cf71dc484cedda55303ec6576c42ae3367f377118da4d685" exitCode=0 Oct 03 09:03:50 crc kubenswrapper[4664]: I1003 09:03:50.218371 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6td26" event={"ID":"afc7650b-353f-4385-a823-2106df87a3f4","Type":"ContainerDied","Data":"f3d3830f83842d00cf71dc484cedda55303ec6576c42ae3367f377118da4d685"} Oct 03 09:03:50 crc kubenswrapper[4664]: I1003 09:03:50.218423 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6td26" event={"ID":"afc7650b-353f-4385-a823-2106df87a3f4","Type":"ContainerStarted","Data":"3578c9b1448b1e7040764f9ab10ac126c1afaa44a6b354a433017cc4ab256d3b"} Oct 03 09:03:50 crc kubenswrapper[4664]: I1003 09:03:50.221363 4664 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:03:51 crc kubenswrapper[4664]: I1003 09:03:51.233697 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6td26" event={"ID":"afc7650b-353f-4385-a823-2106df87a3f4","Type":"ContainerStarted","Data":"3bea9766da2480a91c269fb2cf32838e7cb8fd46420800320df90c1c76474180"} Oct 03 09:03:52 crc kubenswrapper[4664]: I1003 09:03:52.246124 4664 generic.go:334] "Generic (PLEG): container finished" podID="afc7650b-353f-4385-a823-2106df87a3f4" containerID="3bea9766da2480a91c269fb2cf32838e7cb8fd46420800320df90c1c76474180" exitCode=0 Oct 03 09:03:52 crc kubenswrapper[4664]: I1003 09:03:52.247038 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6td26" event={"ID":"afc7650b-353f-4385-a823-2106df87a3f4","Type":"ContainerDied","Data":"3bea9766da2480a91c269fb2cf32838e7cb8fd46420800320df90c1c76474180"} Oct 03 09:03:53 crc kubenswrapper[4664]: I1003 09:03:53.276456 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6td26" event={"ID":"afc7650b-353f-4385-a823-2106df87a3f4","Type":"ContainerStarted","Data":"8f394b024494b2c37411b8edaad4317d3e70f88e4bcb6c6d67570e9e5bd08232"} Oct 03 09:03:53 crc kubenswrapper[4664]: I1003 09:03:53.306099 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6td26" podStartSLOduration=2.8124874 podStartE2EDuration="5.306075131s" podCreationTimestamp="2025-10-03 09:03:48 +0000 UTC" firstStartedPulling="2025-10-03 09:03:50.220722982 +0000 UTC m=+4531.041913472" lastFinishedPulling="2025-10-03 09:03:52.714310723 +0000 UTC m=+4533.535501203" observedRunningTime="2025-10-03 09:03:53.300007387 +0000 UTC m=+4534.121197887" watchObservedRunningTime="2025-10-03 
09:03:53.306075131 +0000 UTC m=+4534.127265631" Oct 03 09:03:59 crc kubenswrapper[4664]: I1003 09:03:59.009376 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:03:59 crc kubenswrapper[4664]: I1003 09:03:59.011622 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:03:59 crc kubenswrapper[4664]: I1003 09:03:59.061183 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:03:59 crc kubenswrapper[4664]: I1003 09:03:59.379074 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:03:59 crc kubenswrapper[4664]: I1003 09:03:59.433187 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6td26"] Oct 03 09:04:01 crc kubenswrapper[4664]: I1003 09:04:01.345357 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6td26" podUID="afc7650b-353f-4385-a823-2106df87a3f4" containerName="registry-server" containerID="cri-o://8f394b024494b2c37411b8edaad4317d3e70f88e4bcb6c6d67570e9e5bd08232" gracePeriod=2 Oct 03 09:04:01 crc kubenswrapper[4664]: I1003 09:04:01.872279 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:04:01 crc kubenswrapper[4664]: I1003 09:04:01.898431 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rxw5\" (UniqueName: \"kubernetes.io/projected/afc7650b-353f-4385-a823-2106df87a3f4-kube-api-access-4rxw5\") pod \"afc7650b-353f-4385-a823-2106df87a3f4\" (UID: \"afc7650b-353f-4385-a823-2106df87a3f4\") " Oct 03 09:04:01 crc kubenswrapper[4664]: I1003 09:04:01.898657 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afc7650b-353f-4385-a823-2106df87a3f4-catalog-content\") pod \"afc7650b-353f-4385-a823-2106df87a3f4\" (UID: \"afc7650b-353f-4385-a823-2106df87a3f4\") " Oct 03 09:04:01 crc kubenswrapper[4664]: I1003 09:04:01.898719 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afc7650b-353f-4385-a823-2106df87a3f4-utilities\") pod \"afc7650b-353f-4385-a823-2106df87a3f4\" (UID: \"afc7650b-353f-4385-a823-2106df87a3f4\") " Oct 03 09:04:01 crc kubenswrapper[4664]: I1003 09:04:01.900811 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afc7650b-353f-4385-a823-2106df87a3f4-utilities" (OuterVolumeSpecName: "utilities") pod "afc7650b-353f-4385-a823-2106df87a3f4" (UID: "afc7650b-353f-4385-a823-2106df87a3f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:01 crc kubenswrapper[4664]: I1003 09:04:01.924801 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afc7650b-353f-4385-a823-2106df87a3f4-kube-api-access-4rxw5" (OuterVolumeSpecName: "kube-api-access-4rxw5") pod "afc7650b-353f-4385-a823-2106df87a3f4" (UID: "afc7650b-353f-4385-a823-2106df87a3f4"). InnerVolumeSpecName "kube-api-access-4rxw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:01 crc kubenswrapper[4664]: I1003 09:04:01.970336 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afc7650b-353f-4385-a823-2106df87a3f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "afc7650b-353f-4385-a823-2106df87a3f4" (UID: "afc7650b-353f-4385-a823-2106df87a3f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.001206 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afc7650b-353f-4385-a823-2106df87a3f4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.001244 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afc7650b-353f-4385-a823-2106df87a3f4-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.001268 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rxw5\" (UniqueName: \"kubernetes.io/projected/afc7650b-353f-4385-a823-2106df87a3f4-kube-api-access-4rxw5\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.359177 4664 generic.go:334] "Generic (PLEG): container finished" podID="afc7650b-353f-4385-a823-2106df87a3f4" containerID="8f394b024494b2c37411b8edaad4317d3e70f88e4bcb6c6d67570e9e5bd08232" exitCode=0 Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.359257 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6td26" event={"ID":"afc7650b-353f-4385-a823-2106df87a3f4","Type":"ContainerDied","Data":"8f394b024494b2c37411b8edaad4317d3e70f88e4bcb6c6d67570e9e5bd08232"} Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.359307 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6td26" event={"ID":"afc7650b-353f-4385-a823-2106df87a3f4","Type":"ContainerDied","Data":"3578c9b1448b1e7040764f9ab10ac126c1afaa44a6b354a433017cc4ab256d3b"} Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.359335 4664 scope.go:117] "RemoveContainer" containerID="8f394b024494b2c37411b8edaad4317d3e70f88e4bcb6c6d67570e9e5bd08232" Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.359412 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6td26" Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.388681 4664 scope.go:117] "RemoveContainer" containerID="3bea9766da2480a91c269fb2cf32838e7cb8fd46420800320df90c1c76474180" Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.395900 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6td26"] Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.424085 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6td26"] Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.936040 4664 scope.go:117] "RemoveContainer" containerID="f3d3830f83842d00cf71dc484cedda55303ec6576c42ae3367f377118da4d685" Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.992295 4664 scope.go:117] "RemoveContainer" containerID="8f394b024494b2c37411b8edaad4317d3e70f88e4bcb6c6d67570e9e5bd08232" Oct 03 09:04:02 crc kubenswrapper[4664]: E1003 09:04:02.992843 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f394b024494b2c37411b8edaad4317d3e70f88e4bcb6c6d67570e9e5bd08232\": container with ID starting with 8f394b024494b2c37411b8edaad4317d3e70f88e4bcb6c6d67570e9e5bd08232 not found: ID does not exist" containerID="8f394b024494b2c37411b8edaad4317d3e70f88e4bcb6c6d67570e9e5bd08232" Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.992887 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f394b024494b2c37411b8edaad4317d3e70f88e4bcb6c6d67570e9e5bd08232"} err="failed to get container status \"8f394b024494b2c37411b8edaad4317d3e70f88e4bcb6c6d67570e9e5bd08232\": rpc error: code = NotFound desc = could not find container \"8f394b024494b2c37411b8edaad4317d3e70f88e4bcb6c6d67570e9e5bd08232\": container with ID starting with 8f394b024494b2c37411b8edaad4317d3e70f88e4bcb6c6d67570e9e5bd08232 not found: ID does not exist" Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.992913 4664 scope.go:117] "RemoveContainer" containerID="3bea9766da2480a91c269fb2cf32838e7cb8fd46420800320df90c1c76474180" Oct 03 09:04:02 crc kubenswrapper[4664]: E1003 09:04:02.993317 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bea9766da2480a91c269fb2cf32838e7cb8fd46420800320df90c1c76474180\": container with ID starting with 3bea9766da2480a91c269fb2cf32838e7cb8fd46420800320df90c1c76474180 not found: ID does not exist" containerID="3bea9766da2480a91c269fb2cf32838e7cb8fd46420800320df90c1c76474180" Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.993340 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bea9766da2480a91c269fb2cf32838e7cb8fd46420800320df90c1c76474180"} err="failed to get container status \"3bea9766da2480a91c269fb2cf32838e7cb8fd46420800320df90c1c76474180\": rpc error: code = NotFound desc = could not find container \"3bea9766da2480a91c269fb2cf32838e7cb8fd46420800320df90c1c76474180\": container with ID starting with 3bea9766da2480a91c269fb2cf32838e7cb8fd46420800320df90c1c76474180 not found: ID does not exist" Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.993355 4664 scope.go:117] "RemoveContainer" containerID="f3d3830f83842d00cf71dc484cedda55303ec6576c42ae3367f377118da4d685" Oct 03 09:04:02 crc kubenswrapper[4664]: E1003 09:04:02.993862 4664 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f3d3830f83842d00cf71dc484cedda55303ec6576c42ae3367f377118da4d685\": container with ID starting with f3d3830f83842d00cf71dc484cedda55303ec6576c42ae3367f377118da4d685 not found: ID does not exist" containerID="f3d3830f83842d00cf71dc484cedda55303ec6576c42ae3367f377118da4d685" Oct 03 09:04:02 crc kubenswrapper[4664]: I1003 09:04:02.993899 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3d3830f83842d00cf71dc484cedda55303ec6576c42ae3367f377118da4d685"} err="failed to get container status \"f3d3830f83842d00cf71dc484cedda55303ec6576c42ae3367f377118da4d685\": rpc error: code = NotFound desc = could not find container \"f3d3830f83842d00cf71dc484cedda55303ec6576c42ae3367f377118da4d685\": container with ID starting with f3d3830f83842d00cf71dc484cedda55303ec6576c42ae3367f377118da4d685 not found: ID does not exist" Oct 03 09:04:03 crc kubenswrapper[4664]: I1003 09:04:03.888255 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afc7650b-353f-4385-a823-2106df87a3f4" path="/var/lib/kubelet/pods/afc7650b-353f-4385-a823-2106df87a3f4/volumes" Oct 03 09:04:11 crc kubenswrapper[4664]: I1003 09:04:11.987443 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:04:11 crc kubenswrapper[4664]: I1003 09:04:11.988146 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:04:41 crc kubenswrapper[4664]: I1003 09:04:41.987910 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:04:41 crc kubenswrapper[4664]: I1003 09:04:41.988778 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:04:41 crc kubenswrapper[4664]: I1003 09:04:41.988855 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 09:04:41 crc kubenswrapper[4664]: I1003 09:04:41.990451 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:04:41 crc kubenswrapper[4664]: I1003 09:04:41.990524 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" 
podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" gracePeriod=600 Oct 03 09:04:42 crc kubenswrapper[4664]: E1003 09:04:42.747525 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:04:42 crc kubenswrapper[4664]: I1003 09:04:42.802945 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" exitCode=0 Oct 03 09:04:42 crc kubenswrapper[4664]: I1003 09:04:42.803028 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da"} Oct 03 09:04:42 crc kubenswrapper[4664]: I1003 09:04:42.803459 4664 scope.go:117] "RemoveContainer" containerID="4721b0d96904390f5d1bae6f797eb14b686b3ebe327c373eedf7fe5fd3b02c6f" Oct 03 09:04:42 crc kubenswrapper[4664]: I1003 09:04:42.804326 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:04:42 crc kubenswrapper[4664]: E1003 09:04:42.804677 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:04:53 crc kubenswrapper[4664]: I1003 09:04:53.877771 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:04:53 crc kubenswrapper[4664]: E1003 09:04:53.880134 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:05:08 crc kubenswrapper[4664]: I1003 09:05:08.876788 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:05:08 crc kubenswrapper[4664]: E1003 09:05:08.877954 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:05:17 crc kubenswrapper[4664]: I1003 09:05:17.804895 4664 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qhvbq"] Oct 03 09:05:17 crc kubenswrapper[4664]: E1003 09:05:17.806834 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc7650b-353f-4385-a823-2106df87a3f4" containerName="registry-server" Oct 03 09:05:17 crc kubenswrapper[4664]: I1003 09:05:17.806872 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc7650b-353f-4385-a823-2106df87a3f4" containerName="registry-server" Oct 03 09:05:17 crc kubenswrapper[4664]: E1003 09:05:17.806905 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc7650b-353f-4385-a823-2106df87a3f4" containerName="extract-content" Oct 03 09:05:17 crc kubenswrapper[4664]: I1003 09:05:17.806919 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc7650b-353f-4385-a823-2106df87a3f4" containerName="extract-content" Oct 03 09:05:17 crc kubenswrapper[4664]: E1003 09:05:17.806934 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc7650b-353f-4385-a823-2106df87a3f4" containerName="extract-utilities" Oct 03 09:05:17 crc kubenswrapper[4664]: I1003 09:05:17.806947 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc7650b-353f-4385-a823-2106df87a3f4" containerName="extract-utilities" Oct 03 09:05:17 crc kubenswrapper[4664]: I1003 09:05:17.807389 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc7650b-353f-4385-a823-2106df87a3f4" containerName="registry-server" Oct 03 09:05:17 crc kubenswrapper[4664]: I1003 09:05:17.815425 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:17 crc kubenswrapper[4664]: I1003 09:05:17.818904 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhvbq"] Oct 03 09:05:17 crc kubenswrapper[4664]: I1003 09:05:17.941632 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6129669b-c54a-483c-90ad-faf18ad5a2bb-catalog-content\") pod \"redhat-marketplace-qhvbq\" (UID: \"6129669b-c54a-483c-90ad-faf18ad5a2bb\") " pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:17 crc kubenswrapper[4664]: I1003 09:05:17.942964 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6129669b-c54a-483c-90ad-faf18ad5a2bb-utilities\") pod \"redhat-marketplace-qhvbq\" (UID: \"6129669b-c54a-483c-90ad-faf18ad5a2bb\") " pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:17 crc kubenswrapper[4664]: I1003 09:05:17.943478 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbfts\" (UniqueName: \"kubernetes.io/projected/6129669b-c54a-483c-90ad-faf18ad5a2bb-kube-api-access-tbfts\") pod \"redhat-marketplace-qhvbq\" (UID: \"6129669b-c54a-483c-90ad-faf18ad5a2bb\") " pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:18 crc kubenswrapper[4664]: I1003 09:05:18.045658 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6129669b-c54a-483c-90ad-faf18ad5a2bb-catalog-content\") pod \"redhat-marketplace-qhvbq\" (UID: \"6129669b-c54a-483c-90ad-faf18ad5a2bb\") " pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:18 crc kubenswrapper[4664]: I1003 09:05:18.045902 4664 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6129669b-c54a-483c-90ad-faf18ad5a2bb-utilities\") pod \"redhat-marketplace-qhvbq\" (UID: \"6129669b-c54a-483c-90ad-faf18ad5a2bb\") " pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:18 crc kubenswrapper[4664]: I1003 09:05:18.045993 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbfts\" (UniqueName: \"kubernetes.io/projected/6129669b-c54a-483c-90ad-faf18ad5a2bb-kube-api-access-tbfts\") pod \"redhat-marketplace-qhvbq\" (UID: \"6129669b-c54a-483c-90ad-faf18ad5a2bb\") " pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:18 crc kubenswrapper[4664]: I1003 09:05:18.046276 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6129669b-c54a-483c-90ad-faf18ad5a2bb-catalog-content\") pod \"redhat-marketplace-qhvbq\" (UID: \"6129669b-c54a-483c-90ad-faf18ad5a2bb\") " pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:18 crc kubenswrapper[4664]: I1003 09:05:18.046672 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6129669b-c54a-483c-90ad-faf18ad5a2bb-utilities\") pod \"redhat-marketplace-qhvbq\" (UID: \"6129669b-c54a-483c-90ad-faf18ad5a2bb\") " pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:18 crc kubenswrapper[4664]: I1003 09:05:18.073788 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbfts\" (UniqueName: \"kubernetes.io/projected/6129669b-c54a-483c-90ad-faf18ad5a2bb-kube-api-access-tbfts\") pod \"redhat-marketplace-qhvbq\" (UID: \"6129669b-c54a-483c-90ad-faf18ad5a2bb\") " pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:18 crc kubenswrapper[4664]: I1003 09:05:18.142439 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:18 crc kubenswrapper[4664]: I1003 09:05:18.645208 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhvbq"] Oct 03 09:05:19 crc kubenswrapper[4664]: I1003 09:05:19.243426 4664 generic.go:334] "Generic (PLEG): container finished" podID="6129669b-c54a-483c-90ad-faf18ad5a2bb" containerID="11a137c3f07baf2c6a26972825cc2473564f1e812f6875a32691ed85a571e5d0" exitCode=0 Oct 03 09:05:19 crc kubenswrapper[4664]: I1003 09:05:19.243723 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhvbq" event={"ID":"6129669b-c54a-483c-90ad-faf18ad5a2bb","Type":"ContainerDied","Data":"11a137c3f07baf2c6a26972825cc2473564f1e812f6875a32691ed85a571e5d0"} Oct 03 09:05:19 crc kubenswrapper[4664]: I1003 09:05:19.243952 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhvbq" event={"ID":"6129669b-c54a-483c-90ad-faf18ad5a2bb","Type":"ContainerStarted","Data":"382f79d9f7827fe5f402dffb429d3f861f03e4215255c30c2a7e6bfdb89ed48b"} Oct 03 09:05:20 crc kubenswrapper[4664]: I1003 09:05:20.256653 4664 generic.go:334] "Generic (PLEG): container finished" podID="6129669b-c54a-483c-90ad-faf18ad5a2bb" containerID="402aeb7b58c06c4b150cd752a289ab58302ce8c98760b7da7f0b7825e262cc43" exitCode=0 Oct 03 09:05:20 crc kubenswrapper[4664]: I1003 09:05:20.256906 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhvbq" event={"ID":"6129669b-c54a-483c-90ad-faf18ad5a2bb","Type":"ContainerDied","Data":"402aeb7b58c06c4b150cd752a289ab58302ce8c98760b7da7f0b7825e262cc43"} Oct 03 09:05:21 crc kubenswrapper[4664]: I1003 09:05:21.269488 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhvbq" event={"ID":"6129669b-c54a-483c-90ad-faf18ad5a2bb","Type":"ContainerStarted","Data":"19f37f755ffe02ffcf2f477d4cdbb36627dfe347b325c8b1da9730bb1606b473"} Oct 03 09:05:21 crc kubenswrapper[4664]: I1003 09:05:21.306057 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qhvbq" podStartSLOduration=2.645615063 podStartE2EDuration="4.30602423s" podCreationTimestamp="2025-10-03 09:05:17 +0000 UTC" firstStartedPulling="2025-10-03 09:05:19.245720883 +0000 UTC m=+4620.066911373" lastFinishedPulling="2025-10-03 09:05:20.90613005 +0000 UTC m=+4621.727320540" observedRunningTime="2025-10-03 09:05:21.2962403 +0000 UTC m=+4622.117430810" watchObservedRunningTime="2025-10-03 09:05:21.30602423 +0000 UTC m=+4622.127214720" Oct 03 09:05:23 crc kubenswrapper[4664]: I1003 09:05:23.877622 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:05:23 crc kubenswrapper[4664]: E1003 09:05:23.878891 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:05:28 crc kubenswrapper[4664]: I1003 09:05:28.143693 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:28 crc 
kubenswrapper[4664]: I1003 09:05:28.144685 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:28 crc kubenswrapper[4664]: I1003 09:05:28.196226 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:28 crc kubenswrapper[4664]: I1003 09:05:28.405298 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:28 crc kubenswrapper[4664]: I1003 09:05:28.469242 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhvbq"] Oct 03 09:05:30 crc kubenswrapper[4664]: I1003 09:05:30.372764 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qhvbq" podUID="6129669b-c54a-483c-90ad-faf18ad5a2bb" containerName="registry-server" containerID="cri-o://19f37f755ffe02ffcf2f477d4cdbb36627dfe347b325c8b1da9730bb1606b473" gracePeriod=2 Oct 03 09:05:30 crc kubenswrapper[4664]: I1003 09:05:30.838533 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:30 crc kubenswrapper[4664]: I1003 09:05:30.991723 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6129669b-c54a-483c-90ad-faf18ad5a2bb-catalog-content\") pod \"6129669b-c54a-483c-90ad-faf18ad5a2bb\" (UID: \"6129669b-c54a-483c-90ad-faf18ad5a2bb\") " Oct 03 09:05:30 crc kubenswrapper[4664]: I1003 09:05:30.991806 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6129669b-c54a-483c-90ad-faf18ad5a2bb-utilities\") pod \"6129669b-c54a-483c-90ad-faf18ad5a2bb\" (UID: \"6129669b-c54a-483c-90ad-faf18ad5a2bb\") " Oct 03 09:05:30 crc kubenswrapper[4664]: I1003 09:05:30.991868 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbfts\" (UniqueName: \"kubernetes.io/projected/6129669b-c54a-483c-90ad-faf18ad5a2bb-kube-api-access-tbfts\") pod \"6129669b-c54a-483c-90ad-faf18ad5a2bb\" (UID: \"6129669b-c54a-483c-90ad-faf18ad5a2bb\") " Oct 03 09:05:30 crc kubenswrapper[4664]: I1003 09:05:30.993978 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6129669b-c54a-483c-90ad-faf18ad5a2bb-utilities" (OuterVolumeSpecName: "utilities") pod "6129669b-c54a-483c-90ad-faf18ad5a2bb" (UID: "6129669b-c54a-483c-90ad-faf18ad5a2bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.003950 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6129669b-c54a-483c-90ad-faf18ad5a2bb-kube-api-access-tbfts" (OuterVolumeSpecName: "kube-api-access-tbfts") pod "6129669b-c54a-483c-90ad-faf18ad5a2bb" (UID: "6129669b-c54a-483c-90ad-faf18ad5a2bb"). InnerVolumeSpecName "kube-api-access-tbfts". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.007028 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6129669b-c54a-483c-90ad-faf18ad5a2bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6129669b-c54a-483c-90ad-faf18ad5a2bb" (UID: "6129669b-c54a-483c-90ad-faf18ad5a2bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.095224 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6129669b-c54a-483c-90ad-faf18ad5a2bb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.095273 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6129669b-c54a-483c-90ad-faf18ad5a2bb-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.095292 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbfts\" (UniqueName: \"kubernetes.io/projected/6129669b-c54a-483c-90ad-faf18ad5a2bb-kube-api-access-tbfts\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.389441 4664 generic.go:334] "Generic (PLEG): container finished" podID="6129669b-c54a-483c-90ad-faf18ad5a2bb" containerID="19f37f755ffe02ffcf2f477d4cdbb36627dfe347b325c8b1da9730bb1606b473" exitCode=0 Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.389510 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhvbq" event={"ID":"6129669b-c54a-483c-90ad-faf18ad5a2bb","Type":"ContainerDied","Data":"19f37f755ffe02ffcf2f477d4cdbb36627dfe347b325c8b1da9730bb1606b473"} Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.389559 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhvbq" event={"ID":"6129669b-c54a-483c-90ad-faf18ad5a2bb","Type":"ContainerDied","Data":"382f79d9f7827fe5f402dffb429d3f861f03e4215255c30c2a7e6bfdb89ed48b"} Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.389584 4664 scope.go:117] "RemoveContainer" containerID="19f37f755ffe02ffcf2f477d4cdbb36627dfe347b325c8b1da9730bb1606b473" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.389581 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhvbq" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.447967 4664 scope.go:117] "RemoveContainer" containerID="402aeb7b58c06c4b150cd752a289ab58302ce8c98760b7da7f0b7825e262cc43" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.465955 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhvbq"] Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.478666 4664 scope.go:117] "RemoveContainer" containerID="11a137c3f07baf2c6a26972825cc2473564f1e812f6875a32691ed85a571e5d0" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.479880 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhvbq"] Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.515745 4664 scope.go:117] "RemoveContainer" containerID="19f37f755ffe02ffcf2f477d4cdbb36627dfe347b325c8b1da9730bb1606b473" Oct 03 09:05:31 crc kubenswrapper[4664]: E1003 09:05:31.516226 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19f37f755ffe02ffcf2f477d4cdbb36627dfe347b325c8b1da9730bb1606b473\": container with ID starting with 19f37f755ffe02ffcf2f477d4cdbb36627dfe347b325c8b1da9730bb1606b473 not found: ID does not exist" containerID="19f37f755ffe02ffcf2f477d4cdbb36627dfe347b325c8b1da9730bb1606b473" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.516265 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f37f755ffe02ffcf2f477d4cdbb36627dfe347b325c8b1da9730bb1606b473"} err="failed to get container status \"19f37f755ffe02ffcf2f477d4cdbb36627dfe347b325c8b1da9730bb1606b473\": rpc error: code = NotFound desc = could not find container \"19f37f755ffe02ffcf2f477d4cdbb36627dfe347b325c8b1da9730bb1606b473\": container with ID starting with 19f37f755ffe02ffcf2f477d4cdbb36627dfe347b325c8b1da9730bb1606b473 not found: ID does not exist" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.516321 4664 scope.go:117] "RemoveContainer" containerID="402aeb7b58c06c4b150cd752a289ab58302ce8c98760b7da7f0b7825e262cc43" Oct 03 09:05:31 crc kubenswrapper[4664]: E1003 09:05:31.516845 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"402aeb7b58c06c4b150cd752a289ab58302ce8c98760b7da7f0b7825e262cc43\": container with ID starting with 402aeb7b58c06c4b150cd752a289ab58302ce8c98760b7da7f0b7825e262cc43 not found: ID does not exist" containerID="402aeb7b58c06c4b150cd752a289ab58302ce8c98760b7da7f0b7825e262cc43" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.516871 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"402aeb7b58c06c4b150cd752a289ab58302ce8c98760b7da7f0b7825e262cc43"} err="failed to get container status \"402aeb7b58c06c4b150cd752a289ab58302ce8c98760b7da7f0b7825e262cc43\": rpc error: code = NotFound desc = could not find container \"402aeb7b58c06c4b150cd752a289ab58302ce8c98760b7da7f0b7825e262cc43\": container with ID starting with 402aeb7b58c06c4b150cd752a289ab58302ce8c98760b7da7f0b7825e262cc43 not found: ID does not exist" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.516885 4664 scope.go:117] "RemoveContainer" containerID="11a137c3f07baf2c6a26972825cc2473564f1e812f6875a32691ed85a571e5d0" Oct 03 09:05:31 crc kubenswrapper[4664]: E1003 09:05:31.517098 4664 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"11a137c3f07baf2c6a26972825cc2473564f1e812f6875a32691ed85a571e5d0\": container with ID starting with 11a137c3f07baf2c6a26972825cc2473564f1e812f6875a32691ed85a571e5d0 not found: ID does not exist" containerID="11a137c3f07baf2c6a26972825cc2473564f1e812f6875a32691ed85a571e5d0" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.517120 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11a137c3f07baf2c6a26972825cc2473564f1e812f6875a32691ed85a571e5d0"} err="failed to get container status \"11a137c3f07baf2c6a26972825cc2473564f1e812f6875a32691ed85a571e5d0\": rpc error: code = NotFound desc = could not find container \"11a137c3f07baf2c6a26972825cc2473564f1e812f6875a32691ed85a571e5d0\": container with ID starting with 11a137c3f07baf2c6a26972825cc2473564f1e812f6875a32691ed85a571e5d0 not found: ID does not exist" Oct 03 09:05:31 crc kubenswrapper[4664]: I1003 09:05:31.903757 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6129669b-c54a-483c-90ad-faf18ad5a2bb" path="/var/lib/kubelet/pods/6129669b-c54a-483c-90ad-faf18ad5a2bb/volumes" Oct 03 09:05:38 crc kubenswrapper[4664]: I1003 09:05:38.877378 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:05:38 crc kubenswrapper[4664]: E1003 09:05:38.878762 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:05:51 crc kubenswrapper[4664]: I1003 09:05:51.878313 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:05:51 crc kubenswrapper[4664]: E1003 09:05:51.879628 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:06:05 crc kubenswrapper[4664]: I1003 09:06:05.876857 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:06:05 crc kubenswrapper[4664]: E1003 09:06:05.877890 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:06:17 crc kubenswrapper[4664]: I1003 09:06:17.876956 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:06:17 crc kubenswrapper[4664]: E1003 09:06:17.878602 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:06:29 crc kubenswrapper[4664]: I1003 09:06:29.889192 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:06:29 crc kubenswrapper[4664]: E1003 09:06:29.890374 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:06:44 crc kubenswrapper[4664]: I1003 09:06:44.877090 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:06:44 crc kubenswrapper[4664]: E1003 09:06:44.878545 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:06:58 crc kubenswrapper[4664]: I1003 09:06:58.876754 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:06:58 crc kubenswrapper[4664]: E1003 09:06:58.877789 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:07:04 crc kubenswrapper[4664]: I1003 09:07:04.775392 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kss57"] Oct 03 09:07:04 crc kubenswrapper[4664]: E1003 09:07:04.776819 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6129669b-c54a-483c-90ad-faf18ad5a2bb" containerName="extract-utilities" Oct 03 09:07:04 crc kubenswrapper[4664]: I1003 09:07:04.776842 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6129669b-c54a-483c-90ad-faf18ad5a2bb" containerName="extract-utilities" Oct 03 09:07:04 crc kubenswrapper[4664]: E1003 09:07:04.776879 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6129669b-c54a-483c-90ad-faf18ad5a2bb" containerName="extract-content" Oct 03 09:07:04 crc kubenswrapper[4664]: I1003 09:07:04.776885 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6129669b-c54a-483c-90ad-faf18ad5a2bb" containerName="extract-content" Oct 03 09:07:04 crc kubenswrapper[4664]: E1003 09:07:04.776904 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6129669b-c54a-483c-90ad-faf18ad5a2bb" containerName="registry-server" Oct 03 09:07:04 crc 
kubenswrapper[4664]: I1003 09:07:04.776910 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="6129669b-c54a-483c-90ad-faf18ad5a2bb" containerName="registry-server" Oct 03 09:07:04 crc kubenswrapper[4664]: I1003 09:07:04.777104 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="6129669b-c54a-483c-90ad-faf18ad5a2bb" containerName="registry-server" Oct 03 09:07:04 crc kubenswrapper[4664]: I1003 09:07:04.778981 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:04 crc kubenswrapper[4664]: I1003 09:07:04.789867 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kss57"] Oct 03 09:07:04 crc kubenswrapper[4664]: I1003 09:07:04.810562 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a381683-ec1c-45ff-8775-7340ac298fb2-catalog-content\") pod \"community-operators-kss57\" (UID: \"4a381683-ec1c-45ff-8775-7340ac298fb2\") " pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:04 crc kubenswrapper[4664]: I1003 09:07:04.810703 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvztv\" (UniqueName: \"kubernetes.io/projected/4a381683-ec1c-45ff-8775-7340ac298fb2-kube-api-access-dvztv\") pod \"community-operators-kss57\" (UID: \"4a381683-ec1c-45ff-8775-7340ac298fb2\") " pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:04 crc kubenswrapper[4664]: I1003 09:07:04.810768 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a381683-ec1c-45ff-8775-7340ac298fb2-utilities\") pod \"community-operators-kss57\" (UID: \"4a381683-ec1c-45ff-8775-7340ac298fb2\") " pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:04 crc kubenswrapper[4664]: I1003 09:07:04.912555 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a381683-ec1c-45ff-8775-7340ac298fb2-catalog-content\") pod \"community-operators-kss57\" (UID: \"4a381683-ec1c-45ff-8775-7340ac298fb2\") " pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:04 crc kubenswrapper[4664]: I1003 09:07:04.912836 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvztv\" (UniqueName: \"kubernetes.io/projected/4a381683-ec1c-45ff-8775-7340ac298fb2-kube-api-access-dvztv\") pod \"community-operators-kss57\" (UID: \"4a381683-ec1c-45ff-8775-7340ac298fb2\") " pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:04 crc kubenswrapper[4664]: I1003 09:07:04.912901 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a381683-ec1c-45ff-8775-7340ac298fb2-utilities\") pod \"community-operators-kss57\" (UID: \"4a381683-ec1c-45ff-8775-7340ac298fb2\") " pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:04 crc kubenswrapper[4664]: I1003 09:07:04.913574 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a381683-ec1c-45ff-8775-7340ac298fb2-catalog-content\") pod \"community-operators-kss57\" (UID: \"4a381683-ec1c-45ff-8775-7340ac298fb2\") " 
pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:04 crc kubenswrapper[4664]: I1003 09:07:04.913823 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a381683-ec1c-45ff-8775-7340ac298fb2-utilities\") pod \"community-operators-kss57\" (UID: \"4a381683-ec1c-45ff-8775-7340ac298fb2\") " pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:04 crc kubenswrapper[4664]: I1003 09:07:04.938445 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvztv\" (UniqueName: \"kubernetes.io/projected/4a381683-ec1c-45ff-8775-7340ac298fb2-kube-api-access-dvztv\") pod \"community-operators-kss57\" (UID: \"4a381683-ec1c-45ff-8775-7340ac298fb2\") " pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:05 crc kubenswrapper[4664]: I1003 09:07:05.118985 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:05 crc kubenswrapper[4664]: I1003 09:07:05.716371 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kss57"] Oct 03 09:07:05 crc kubenswrapper[4664]: W1003 09:07:05.722923 4664 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a381683_ec1c_45ff_8775_7340ac298fb2.slice/crio-a5bc88445df68fed6bcc9a5ade9b328e437b085d334603f6fbbe84a30e605337 WatchSource:0}: Error finding container a5bc88445df68fed6bcc9a5ade9b328e437b085d334603f6fbbe84a30e605337: Status 404 returned error can't find the container with id a5bc88445df68fed6bcc9a5ade9b328e437b085d334603f6fbbe84a30e605337 Oct 03 09:07:06 crc kubenswrapper[4664]: I1003 09:07:06.548197 4664 generic.go:334] "Generic (PLEG): container finished" podID="4a381683-ec1c-45ff-8775-7340ac298fb2" containerID="3c570f42a34450fb53255997be19288cdf544cb9611da0af8c5f0bff9e53037f" exitCode=0 Oct 03 09:07:06 crc kubenswrapper[4664]: I1003 09:07:06.548640 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kss57" event={"ID":"4a381683-ec1c-45ff-8775-7340ac298fb2","Type":"ContainerDied","Data":"3c570f42a34450fb53255997be19288cdf544cb9611da0af8c5f0bff9e53037f"} Oct 03 09:07:06 crc kubenswrapper[4664]: I1003 09:07:06.548679 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kss57" event={"ID":"4a381683-ec1c-45ff-8775-7340ac298fb2","Type":"ContainerStarted","Data":"a5bc88445df68fed6bcc9a5ade9b328e437b085d334603f6fbbe84a30e605337"} Oct 03 09:07:08 crc kubenswrapper[4664]: E1003 09:07:08.245426 4664 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a381683_ec1c_45ff_8775_7340ac298fb2.slice/crio-f42ce62e9aa3e1d4a79c71d892e5d7fe799f1f7dec2142d01528e7a5b1cfe13f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a381683_ec1c_45ff_8775_7340ac298fb2.slice/crio-conmon-f42ce62e9aa3e1d4a79c71d892e5d7fe799f1f7dec2142d01528e7a5b1cfe13f.scope\": RecentStats: unable to find data in memory cache]" Oct 03 09:07:08 crc kubenswrapper[4664]: I1003 09:07:08.575322 4664 generic.go:334] "Generic (PLEG): container finished" podID="4a381683-ec1c-45ff-8775-7340ac298fb2" containerID="f42ce62e9aa3e1d4a79c71d892e5d7fe799f1f7dec2142d01528e7a5b1cfe13f" exitCode=0 
Oct 03 09:07:08 crc kubenswrapper[4664]: I1003 09:07:08.575402 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kss57" event={"ID":"4a381683-ec1c-45ff-8775-7340ac298fb2","Type":"ContainerDied","Data":"f42ce62e9aa3e1d4a79c71d892e5d7fe799f1f7dec2142d01528e7a5b1cfe13f"} Oct 03 09:07:09 crc kubenswrapper[4664]: I1003 09:07:09.590719 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kss57" event={"ID":"4a381683-ec1c-45ff-8775-7340ac298fb2","Type":"ContainerStarted","Data":"d29030e58cdeb65c1d15fc811f2baad06b1e7a8d118df9ace99fe3923406e6f7"} Oct 03 09:07:09 crc kubenswrapper[4664]: I1003 09:07:09.625343 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kss57" podStartSLOduration=2.977479557 podStartE2EDuration="5.625304561s" podCreationTimestamp="2025-10-03 09:07:04 +0000 UTC" firstStartedPulling="2025-10-03 09:07:06.552320865 +0000 UTC m=+4727.373511365" lastFinishedPulling="2025-10-03 09:07:09.200145839 +0000 UTC m=+4730.021336369" observedRunningTime="2025-10-03 09:07:09.616020775 +0000 UTC m=+4730.437211285" watchObservedRunningTime="2025-10-03 09:07:09.625304561 +0000 UTC m=+4730.446495091" Oct 03 09:07:09 crc kubenswrapper[4664]: I1003 09:07:09.882568 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:07:09 crc kubenswrapper[4664]: E1003 09:07:09.882897 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:07:15 crc kubenswrapper[4664]: I1003 09:07:15.119475 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:15 crc kubenswrapper[4664]: I1003 09:07:15.120453 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:15 crc kubenswrapper[4664]: I1003 09:07:15.183889 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:15 crc kubenswrapper[4664]: I1003 09:07:15.732270 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:15 crc kubenswrapper[4664]: I1003 09:07:15.811634 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kss57"] Oct 03 09:07:17 crc kubenswrapper[4664]: I1003 09:07:17.684453 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kss57" podUID="4a381683-ec1c-45ff-8775-7340ac298fb2" containerName="registry-server" containerID="cri-o://d29030e58cdeb65c1d15fc811f2baad06b1e7a8d118df9ace99fe3923406e6f7" gracePeriod=2 Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.237330 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.370235 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a381683-ec1c-45ff-8775-7340ac298fb2-catalog-content\") pod \"4a381683-ec1c-45ff-8775-7340ac298fb2\" (UID: \"4a381683-ec1c-45ff-8775-7340ac298fb2\") " Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.370505 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvztv\" (UniqueName: \"kubernetes.io/projected/4a381683-ec1c-45ff-8775-7340ac298fb2-kube-api-access-dvztv\") pod \"4a381683-ec1c-45ff-8775-7340ac298fb2\" (UID: \"4a381683-ec1c-45ff-8775-7340ac298fb2\") " Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.371879 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a381683-ec1c-45ff-8775-7340ac298fb2-utilities\") pod \"4a381683-ec1c-45ff-8775-7340ac298fb2\" (UID: \"4a381683-ec1c-45ff-8775-7340ac298fb2\") " Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.373042 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a381683-ec1c-45ff-8775-7340ac298fb2-utilities" (OuterVolumeSpecName: "utilities") pod "4a381683-ec1c-45ff-8775-7340ac298fb2" (UID: "4a381683-ec1c-45ff-8775-7340ac298fb2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.379456 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a381683-ec1c-45ff-8775-7340ac298fb2-kube-api-access-dvztv" (OuterVolumeSpecName: "kube-api-access-dvztv") pod "4a381683-ec1c-45ff-8775-7340ac298fb2" (UID: "4a381683-ec1c-45ff-8775-7340ac298fb2"). InnerVolumeSpecName "kube-api-access-dvztv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.474829 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvztv\" (UniqueName: \"kubernetes.io/projected/4a381683-ec1c-45ff-8775-7340ac298fb2-kube-api-access-dvztv\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.474864 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a381683-ec1c-45ff-8775-7340ac298fb2-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.704417 4664 generic.go:334] "Generic (PLEG): container finished" podID="4a381683-ec1c-45ff-8775-7340ac298fb2" containerID="d29030e58cdeb65c1d15fc811f2baad06b1e7a8d118df9ace99fe3923406e6f7" exitCode=0 Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.704549 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kss57" event={"ID":"4a381683-ec1c-45ff-8775-7340ac298fb2","Type":"ContainerDied","Data":"d29030e58cdeb65c1d15fc811f2baad06b1e7a8d118df9ace99fe3923406e6f7"} Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.704892 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kss57" event={"ID":"4a381683-ec1c-45ff-8775-7340ac298fb2","Type":"ContainerDied","Data":"a5bc88445df68fed6bcc9a5ade9b328e437b085d334603f6fbbe84a30e605337"} Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.704916 4664 scope.go:117] "RemoveContainer" containerID="d29030e58cdeb65c1d15fc811f2baad06b1e7a8d118df9ace99fe3923406e6f7" Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.704715 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kss57" Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.739757 4664 scope.go:117] "RemoveContainer" containerID="f42ce62e9aa3e1d4a79c71d892e5d7fe799f1f7dec2142d01528e7a5b1cfe13f" Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.770903 4664 scope.go:117] "RemoveContainer" containerID="3c570f42a34450fb53255997be19288cdf544cb9611da0af8c5f0bff9e53037f" Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.846493 4664 scope.go:117] "RemoveContainer" containerID="d29030e58cdeb65c1d15fc811f2baad06b1e7a8d118df9ace99fe3923406e6f7" Oct 03 09:07:18 crc kubenswrapper[4664]: E1003 09:07:18.847130 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29030e58cdeb65c1d15fc811f2baad06b1e7a8d118df9ace99fe3923406e6f7\": container with ID starting with d29030e58cdeb65c1d15fc811f2baad06b1e7a8d118df9ace99fe3923406e6f7 not found: ID does not exist" containerID="d29030e58cdeb65c1d15fc811f2baad06b1e7a8d118df9ace99fe3923406e6f7" Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.847170 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29030e58cdeb65c1d15fc811f2baad06b1e7a8d118df9ace99fe3923406e6f7"} err="failed to get container status \"d29030e58cdeb65c1d15fc811f2baad06b1e7a8d118df9ace99fe3923406e6f7\": rpc error: code = NotFound desc = could not find container \"d29030e58cdeb65c1d15fc811f2baad06b1e7a8d118df9ace99fe3923406e6f7\": container with ID starting with d29030e58cdeb65c1d15fc811f2baad06b1e7a8d118df9ace99fe3923406e6f7 not found: ID does not exist" Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.847196 4664 scope.go:117] "RemoveContainer" containerID="f42ce62e9aa3e1d4a79c71d892e5d7fe799f1f7dec2142d01528e7a5b1cfe13f" Oct 03 09:07:18 crc kubenswrapper[4664]: E1003 09:07:18.847832 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f42ce62e9aa3e1d4a79c71d892e5d7fe799f1f7dec2142d01528e7a5b1cfe13f\": container with ID starting with f42ce62e9aa3e1d4a79c71d892e5d7fe799f1f7dec2142d01528e7a5b1cfe13f not found: ID does not exist" containerID="f42ce62e9aa3e1d4a79c71d892e5d7fe799f1f7dec2142d01528e7a5b1cfe13f" Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.847901 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42ce62e9aa3e1d4a79c71d892e5d7fe799f1f7dec2142d01528e7a5b1cfe13f"} err="failed to get container status \"f42ce62e9aa3e1d4a79c71d892e5d7fe799f1f7dec2142d01528e7a5b1cfe13f\": rpc error: code = NotFound desc = could not find container \"f42ce62e9aa3e1d4a79c71d892e5d7fe799f1f7dec2142d01528e7a5b1cfe13f\": container with ID starting with f42ce62e9aa3e1d4a79c71d892e5d7fe799f1f7dec2142d01528e7a5b1cfe13f not found: ID does not exist" Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.847938 4664 scope.go:117] "RemoveContainer" containerID="3c570f42a34450fb53255997be19288cdf544cb9611da0af8c5f0bff9e53037f" Oct 03 09:07:18 crc kubenswrapper[4664]: E1003 09:07:18.848451 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c570f42a34450fb53255997be19288cdf544cb9611da0af8c5f0bff9e53037f\": container with ID starting with 3c570f42a34450fb53255997be19288cdf544cb9611da0af8c5f0bff9e53037f not found: ID does not exist" containerID="3c570f42a34450fb53255997be19288cdf544cb9611da0af8c5f0bff9e53037f" 
Oct 03 09:07:18 crc kubenswrapper[4664]: I1003 09:07:18.848484 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c570f42a34450fb53255997be19288cdf544cb9611da0af8c5f0bff9e53037f"} err="failed to get container status \"3c570f42a34450fb53255997be19288cdf544cb9611da0af8c5f0bff9e53037f\": rpc error: code = NotFound desc = could not find container \"3c570f42a34450fb53255997be19288cdf544cb9611da0af8c5f0bff9e53037f\": container with ID starting with 3c570f42a34450fb53255997be19288cdf544cb9611da0af8c5f0bff9e53037f not found: ID does not exist" Oct 03 09:07:19 crc kubenswrapper[4664]: I1003 09:07:19.252128 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a381683-ec1c-45ff-8775-7340ac298fb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a381683-ec1c-45ff-8775-7340ac298fb2" (UID: "4a381683-ec1c-45ff-8775-7340ac298fb2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:07:19 crc kubenswrapper[4664]: I1003 09:07:19.295303 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a381683-ec1c-45ff-8775-7340ac298fb2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:19 crc kubenswrapper[4664]: I1003 09:07:19.358174 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kss57"] Oct 03 09:07:19 crc kubenswrapper[4664]: I1003 09:07:19.367369 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kss57"] Oct 03 09:07:19 crc kubenswrapper[4664]: I1003 09:07:19.902537 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a381683-ec1c-45ff-8775-7340ac298fb2" path="/var/lib/kubelet/pods/4a381683-ec1c-45ff-8775-7340ac298fb2/volumes" Oct 03 09:07:20 crc kubenswrapper[4664]: I1003 09:07:20.881983 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:07:20 crc kubenswrapper[4664]: E1003 09:07:20.882343 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:07:32 crc kubenswrapper[4664]: I1003 09:07:32.876921 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:07:32 crc kubenswrapper[4664]: E1003 09:07:32.878283 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:07:47 crc kubenswrapper[4664]: I1003 09:07:47.877597 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:07:47 crc kubenswrapper[4664]: E1003 09:07:47.878797 4664 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:07:52 crc kubenswrapper[4664]: I1003 09:07:52.829238 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ss9h4"] Oct 03 09:07:52 crc kubenswrapper[4664]: E1003 09:07:52.830702 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a381683-ec1c-45ff-8775-7340ac298fb2" containerName="extract-utilities" Oct 03 09:07:52 crc kubenswrapper[4664]: I1003 09:07:52.830738 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a381683-ec1c-45ff-8775-7340ac298fb2" containerName="extract-utilities" Oct 03 09:07:52 crc kubenswrapper[4664]: E1003 09:07:52.830772 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a381683-ec1c-45ff-8775-7340ac298fb2" containerName="registry-server" Oct 03 09:07:52 crc kubenswrapper[4664]: I1003 09:07:52.830780 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a381683-ec1c-45ff-8775-7340ac298fb2" containerName="registry-server" Oct 03 09:07:52 crc kubenswrapper[4664]: E1003 09:07:52.830832 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a381683-ec1c-45ff-8775-7340ac298fb2" containerName="extract-content" Oct 03 09:07:52 crc kubenswrapper[4664]: I1003 09:07:52.830841 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a381683-ec1c-45ff-8775-7340ac298fb2" containerName="extract-content" Oct 03 09:07:52 crc kubenswrapper[4664]: I1003 09:07:52.831104 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a381683-ec1c-45ff-8775-7340ac298fb2" containerName="registry-server" Oct 03 09:07:52 crc kubenswrapper[4664]: I1003 09:07:52.833087 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:07:52 crc kubenswrapper[4664]: I1003 09:07:52.843566 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ss9h4"] Oct 03 09:07:52 crc kubenswrapper[4664]: I1003 09:07:52.982104 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b528t\" (UniqueName: \"kubernetes.io/projected/9da34948-61c2-49f0-8400-a4b6cf03db40-kube-api-access-b528t\") pod \"redhat-operators-ss9h4\" (UID: \"9da34948-61c2-49f0-8400-a4b6cf03db40\") " pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:07:52 crc kubenswrapper[4664]: I1003 09:07:52.982209 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9da34948-61c2-49f0-8400-a4b6cf03db40-catalog-content\") pod \"redhat-operators-ss9h4\" (UID: \"9da34948-61c2-49f0-8400-a4b6cf03db40\") " pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:07:52 crc kubenswrapper[4664]: I1003 09:07:52.982302 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9da34948-61c2-49f0-8400-a4b6cf03db40-utilities\") pod \"redhat-operators-ss9h4\" (UID: \"9da34948-61c2-49f0-8400-a4b6cf03db40\") " pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:07:53 crc kubenswrapper[4664]: I1003 09:07:53.084340 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9da34948-61c2-49f0-8400-a4b6cf03db40-utilities\") pod \"redhat-operators-ss9h4\" (UID: \"9da34948-61c2-49f0-8400-a4b6cf03db40\") " pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:07:53 crc kubenswrapper[4664]: I1003 09:07:53.084480 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b528t\" (UniqueName: \"kubernetes.io/projected/9da34948-61c2-49f0-8400-a4b6cf03db40-kube-api-access-b528t\") pod \"redhat-operators-ss9h4\" (UID: \"9da34948-61c2-49f0-8400-a4b6cf03db40\") " pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:07:53 crc kubenswrapper[4664]: I1003 09:07:53.084527 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9da34948-61c2-49f0-8400-a4b6cf03db40-catalog-content\") pod \"redhat-operators-ss9h4\" (UID: \"9da34948-61c2-49f0-8400-a4b6cf03db40\") " pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:07:53 crc kubenswrapper[4664]: I1003 09:07:53.085000 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9da34948-61c2-49f0-8400-a4b6cf03db40-catalog-content\") pod \"redhat-operators-ss9h4\" (UID: \"9da34948-61c2-49f0-8400-a4b6cf03db40\") " pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:07:53 crc kubenswrapper[4664]: I1003 09:07:53.085288 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9da34948-61c2-49f0-8400-a4b6cf03db40-utilities\") pod \"redhat-operators-ss9h4\" (UID: \"9da34948-61c2-49f0-8400-a4b6cf03db40\") " pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:07:53 crc kubenswrapper[4664]: I1003 09:07:53.108756 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-b528t\" (UniqueName: \"kubernetes.io/projected/9da34948-61c2-49f0-8400-a4b6cf03db40-kube-api-access-b528t\") pod \"redhat-operators-ss9h4\" (UID: \"9da34948-61c2-49f0-8400-a4b6cf03db40\") " pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:07:53 crc kubenswrapper[4664]: I1003 09:07:53.164381 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:07:53 crc kubenswrapper[4664]: I1003 09:07:53.730583 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ss9h4"] Oct 03 09:07:54 crc kubenswrapper[4664]: I1003 09:07:54.205370 4664 generic.go:334] "Generic (PLEG): container finished" podID="9da34948-61c2-49f0-8400-a4b6cf03db40" containerID="e911f3c51ef37d30c7c8e65d30097b0fbf6e1f66d97be5598ac50ea3781e8896" exitCode=0 Oct 03 09:07:54 crc kubenswrapper[4664]: I1003 09:07:54.205511 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss9h4" event={"ID":"9da34948-61c2-49f0-8400-a4b6cf03db40","Type":"ContainerDied","Data":"e911f3c51ef37d30c7c8e65d30097b0fbf6e1f66d97be5598ac50ea3781e8896"} Oct 03 09:07:54 crc kubenswrapper[4664]: I1003 09:07:54.205854 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss9h4" event={"ID":"9da34948-61c2-49f0-8400-a4b6cf03db40","Type":"ContainerStarted","Data":"8e005f54e28cff8be7e2ed3da3b3cc890cfafd644e298bd4d76c50d65e0e9f66"} Oct 03 09:07:56 crc kubenswrapper[4664]: I1003 09:07:56.238440 4664 generic.go:334] "Generic (PLEG): container finished" podID="9da34948-61c2-49f0-8400-a4b6cf03db40" containerID="76868ae4368b564864444908f114c5bcdc731d86010f605fe0705226b73b1a6e" exitCode=0 Oct 03 09:07:56 crc kubenswrapper[4664]: I1003 09:07:56.238506 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss9h4" event={"ID":"9da34948-61c2-49f0-8400-a4b6cf03db40","Type":"ContainerDied","Data":"76868ae4368b564864444908f114c5bcdc731d86010f605fe0705226b73b1a6e"} Oct 03 09:07:58 crc kubenswrapper[4664]: I1003 09:07:58.271430 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss9h4" event={"ID":"9da34948-61c2-49f0-8400-a4b6cf03db40","Type":"ContainerStarted","Data":"d3bba657b0ee61fa5951ca571b4694c8663076081afd799c79d511506b287892"} Oct 03 09:07:58 crc kubenswrapper[4664]: I1003 09:07:58.304336 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ss9h4" podStartSLOduration=2.587652161 podStartE2EDuration="6.304314708s" podCreationTimestamp="2025-10-03 09:07:52 +0000 UTC" firstStartedPulling="2025-10-03 09:07:54.207368122 +0000 UTC m=+4775.028558642" lastFinishedPulling="2025-10-03 09:07:57.924030689 +0000 UTC m=+4778.745221189" observedRunningTime="2025-10-03 09:07:58.298144861 +0000 UTC m=+4779.119335381" watchObservedRunningTime="2025-10-03 09:07:58.304314708 +0000 UTC m=+4779.125505198" Oct 03 09:08:01 crc kubenswrapper[4664]: I1003 09:08:01.877085 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:08:01 crc kubenswrapper[4664]: E1003 09:08:01.878493 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:08:03 crc kubenswrapper[4664]: I1003 09:08:03.165008 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:08:03 crc kubenswrapper[4664]: I1003 09:08:03.165083 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:08:03 crc kubenswrapper[4664]: I1003 09:08:03.246292 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:08:03 crc kubenswrapper[4664]: I1003 09:08:03.396782 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:08:03 crc kubenswrapper[4664]: I1003 09:08:03.504165 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ss9h4"] Oct 03 09:08:05 crc kubenswrapper[4664]: I1003 09:08:05.351237 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ss9h4" podUID="9da34948-61c2-49f0-8400-a4b6cf03db40" containerName="registry-server" containerID="cri-o://d3bba657b0ee61fa5951ca571b4694c8663076081afd799c79d511506b287892" gracePeriod=2 Oct 03 09:08:06 crc kubenswrapper[4664]: I1003 09:08:06.372494 4664 generic.go:334] "Generic (PLEG): container finished" podID="9da34948-61c2-49f0-8400-a4b6cf03db40" containerID="d3bba657b0ee61fa5951ca571b4694c8663076081afd799c79d511506b287892" exitCode=0 Oct 03 09:08:06 crc kubenswrapper[4664]: I1003 09:08:06.372815 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss9h4" event={"ID":"9da34948-61c2-49f0-8400-a4b6cf03db40","Type":"ContainerDied","Data":"d3bba657b0ee61fa5951ca571b4694c8663076081afd799c79d511506b287892"} Oct 03 09:08:06 crc kubenswrapper[4664]: I1003 09:08:06.765316 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:08:06 crc kubenswrapper[4664]: I1003 09:08:06.939140 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9da34948-61c2-49f0-8400-a4b6cf03db40-utilities\") pod \"9da34948-61c2-49f0-8400-a4b6cf03db40\" (UID: \"9da34948-61c2-49f0-8400-a4b6cf03db40\") " Oct 03 09:08:06 crc kubenswrapper[4664]: I1003 09:08:06.939261 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9da34948-61c2-49f0-8400-a4b6cf03db40-catalog-content\") pod \"9da34948-61c2-49f0-8400-a4b6cf03db40\" (UID: \"9da34948-61c2-49f0-8400-a4b6cf03db40\") " Oct 03 09:08:06 crc kubenswrapper[4664]: I1003 09:08:06.939347 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b528t\" (UniqueName: \"kubernetes.io/projected/9da34948-61c2-49f0-8400-a4b6cf03db40-kube-api-access-b528t\") pod \"9da34948-61c2-49f0-8400-a4b6cf03db40\" (UID: \"9da34948-61c2-49f0-8400-a4b6cf03db40\") " Oct 03 09:08:06 crc kubenswrapper[4664]: I1003 09:08:06.942155 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9da34948-61c2-49f0-8400-a4b6cf03db40-utilities" (OuterVolumeSpecName: "utilities") pod "9da34948-61c2-49f0-8400-a4b6cf03db40" (UID: "9da34948-61c2-49f0-8400-a4b6cf03db40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:08:06 crc kubenswrapper[4664]: I1003 09:08:06.950399 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da34948-61c2-49f0-8400-a4b6cf03db40-kube-api-access-b528t" (OuterVolumeSpecName: "kube-api-access-b528t") pod "9da34948-61c2-49f0-8400-a4b6cf03db40" (UID: "9da34948-61c2-49f0-8400-a4b6cf03db40"). InnerVolumeSpecName "kube-api-access-b528t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:07 crc kubenswrapper[4664]: I1003 09:08:07.042007 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9da34948-61c2-49f0-8400-a4b6cf03db40-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:07 crc kubenswrapper[4664]: I1003 09:08:07.042057 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b528t\" (UniqueName: \"kubernetes.io/projected/9da34948-61c2-49f0-8400-a4b6cf03db40-kube-api-access-b528t\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:07 crc kubenswrapper[4664]: I1003 09:08:07.385318 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss9h4" event={"ID":"9da34948-61c2-49f0-8400-a4b6cf03db40","Type":"ContainerDied","Data":"8e005f54e28cff8be7e2ed3da3b3cc890cfafd644e298bd4d76c50d65e0e9f66"} Oct 03 09:08:07 crc kubenswrapper[4664]: I1003 09:08:07.385392 4664 scope.go:117] "RemoveContainer" containerID="d3bba657b0ee61fa5951ca571b4694c8663076081afd799c79d511506b287892" Oct 03 09:08:07 crc kubenswrapper[4664]: I1003 09:08:07.385390 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ss9h4" Oct 03 09:08:07 crc kubenswrapper[4664]: I1003 09:08:07.409684 4664 scope.go:117] "RemoveContainer" containerID="76868ae4368b564864444908f114c5bcdc731d86010f605fe0705226b73b1a6e" Oct 03 09:08:07 crc kubenswrapper[4664]: I1003 09:08:07.444576 4664 scope.go:117] "RemoveContainer" containerID="e911f3c51ef37d30c7c8e65d30097b0fbf6e1f66d97be5598ac50ea3781e8896" Oct 03 09:08:08 crc kubenswrapper[4664]: I1003 09:08:08.459468 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9da34948-61c2-49f0-8400-a4b6cf03db40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9da34948-61c2-49f0-8400-a4b6cf03db40" (UID: "9da34948-61c2-49f0-8400-a4b6cf03db40"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:08:08 crc kubenswrapper[4664]: I1003 09:08:08.478893 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9da34948-61c2-49f0-8400-a4b6cf03db40-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:08 crc kubenswrapper[4664]: I1003 09:08:08.639998 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ss9h4"] Oct 03 09:08:08 crc kubenswrapper[4664]: I1003 09:08:08.649756 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ss9h4"] Oct 03 09:08:09 crc kubenswrapper[4664]: I1003 09:08:09.895411 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9da34948-61c2-49f0-8400-a4b6cf03db40" path="/var/lib/kubelet/pods/9da34948-61c2-49f0-8400-a4b6cf03db40/volumes" Oct 03 09:08:13 crc kubenswrapper[4664]: I1003 09:08:13.878699 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:08:13 crc kubenswrapper[4664]: E1003 09:08:13.879519 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:08:27 crc kubenswrapper[4664]: I1003 09:08:27.877053 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:08:27 crc kubenswrapper[4664]: E1003 09:08:27.878462 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:08:40 crc kubenswrapper[4664]: I1003 09:08:40.877637 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:08:40 crc kubenswrapper[4664]: E1003 09:08:40.879435 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:08:52 crc kubenswrapper[4664]: I1003 09:08:52.876796 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:08:52 crc kubenswrapper[4664]: E1003 09:08:52.877655 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:09:04 crc kubenswrapper[4664]: I1003 09:09:04.877282 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:09:04 crc kubenswrapper[4664]: E1003 09:09:04.878439 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:09:19 crc kubenswrapper[4664]: I1003 09:09:19.891748 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:09:19 crc kubenswrapper[4664]: E1003 09:09:19.893105 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:09:33 crc kubenswrapper[4664]: I1003 09:09:33.878522 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:09:33 crc kubenswrapper[4664]: E1003 09:09:33.879704 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:09:45 crc kubenswrapper[4664]: I1003 09:09:45.877108 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:09:46 crc kubenswrapper[4664]: I1003 09:09:46.609856 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"b2f0e8060b32103ae029b4593961836565e27c6c08ac83af035e3afa6eaadd61"} Oct 03 09:12:11 crc kubenswrapper[4664]: I1003 09:12:11.987538 4664 patch_prober.go:28] interesting 
pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:12:11 crc kubenswrapper[4664]: I1003 09:12:11.988241 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:12:41 crc kubenswrapper[4664]: I1003 09:12:41.986929 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:12:41 crc kubenswrapper[4664]: I1003 09:12:41.987720 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:13:11 crc kubenswrapper[4664]: I1003 09:13:11.986812 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:13:11 crc kubenswrapper[4664]: I1003 09:13:11.987829 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:13:11 crc kubenswrapper[4664]: I1003 09:13:11.987931 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" Oct 03 09:13:11 crc kubenswrapper[4664]: I1003 09:13:11.989035 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b2f0e8060b32103ae029b4593961836565e27c6c08ac83af035e3afa6eaadd61"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:13:11 crc kubenswrapper[4664]: I1003 09:13:11.989109 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://b2f0e8060b32103ae029b4593961836565e27c6c08ac83af035e3afa6eaadd61" gracePeriod=600 Oct 03 09:13:12 crc kubenswrapper[4664]: I1003 09:13:12.990835 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="b2f0e8060b32103ae029b4593961836565e27c6c08ac83af035e3afa6eaadd61" exitCode=0 Oct 03 09:13:12 crc kubenswrapper[4664]: I1003 09:13:12.990906 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"b2f0e8060b32103ae029b4593961836565e27c6c08ac83af035e3afa6eaadd61"} Oct 03 09:13:12 crc kubenswrapper[4664]: I1003 09:13:12.991307 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerStarted","Data":"ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107"} Oct 03 09:13:12 crc kubenswrapper[4664]: I1003 09:13:12.991346 4664 scope.go:117] "RemoveContainer" containerID="eb1fa260f925060668dce2865fd89a378a86a7eb69d6cc31ef71fbb4baeb45da" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.201945 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v7wcj/must-gather-8zp6x"] Oct 03 09:13:37 crc kubenswrapper[4664]: E1003 09:13:37.203006 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da34948-61c2-49f0-8400-a4b6cf03db40" containerName="registry-server" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.203027 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da34948-61c2-49f0-8400-a4b6cf03db40" containerName="registry-server" Oct 03 09:13:37 crc kubenswrapper[4664]: E1003 09:13:37.203056 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da34948-61c2-49f0-8400-a4b6cf03db40" containerName="extract-utilities" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.203066 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da34948-61c2-49f0-8400-a4b6cf03db40" containerName="extract-utilities" Oct 03 09:13:37 crc kubenswrapper[4664]: E1003 09:13:37.203120 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da34948-61c2-49f0-8400-a4b6cf03db40" containerName="extract-content" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.203129 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da34948-61c2-49f0-8400-a4b6cf03db40" containerName="extract-content" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.203402 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da34948-61c2-49f0-8400-a4b6cf03db40" containerName="registry-server" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.204751 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7wcj/must-gather-8zp6x" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.206793 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v7wcj"/"kube-root-ca.crt" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.207112 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v7wcj"/"openshift-service-ca.crt" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.271269 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v7wcj/must-gather-8zp6x"] Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.362549 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b5278517-8bdb-4383-9477-631fa551bf9d-must-gather-output\") pod \"must-gather-8zp6x\" (UID: \"b5278517-8bdb-4383-9477-631fa551bf9d\") " pod="openshift-must-gather-v7wcj/must-gather-8zp6x" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.362652 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxnvh\" (UniqueName: \"kubernetes.io/projected/b5278517-8bdb-4383-9477-631fa551bf9d-kube-api-access-wxnvh\") pod \"must-gather-8zp6x\" (UID: \"b5278517-8bdb-4383-9477-631fa551bf9d\") " pod="openshift-must-gather-v7wcj/must-gather-8zp6x" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.464225 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b5278517-8bdb-4383-9477-631fa551bf9d-must-gather-output\") pod \"must-gather-8zp6x\" (UID: \"b5278517-8bdb-4383-9477-631fa551bf9d\") " pod="openshift-must-gather-v7wcj/must-gather-8zp6x" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.464286 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxnvh\" (UniqueName: \"kubernetes.io/projected/b5278517-8bdb-4383-9477-631fa551bf9d-kube-api-access-wxnvh\") pod \"must-gather-8zp6x\" (UID: \"b5278517-8bdb-4383-9477-631fa551bf9d\") " pod="openshift-must-gather-v7wcj/must-gather-8zp6x" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.465271 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b5278517-8bdb-4383-9477-631fa551bf9d-must-gather-output\") pod \"must-gather-8zp6x\" (UID: \"b5278517-8bdb-4383-9477-631fa551bf9d\") " pod="openshift-must-gather-v7wcj/must-gather-8zp6x" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.491124 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxnvh\" (UniqueName: \"kubernetes.io/projected/b5278517-8bdb-4383-9477-631fa551bf9d-kube-api-access-wxnvh\") pod \"must-gather-8zp6x\" (UID: \"b5278517-8bdb-4383-9477-631fa551bf9d\") " pod="openshift-must-gather-v7wcj/must-gather-8zp6x" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.533315 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7wcj/must-gather-8zp6x" Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.885675 4664 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:13:37 crc kubenswrapper[4664]: I1003 09:13:37.890151 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v7wcj/must-gather-8zp6x"] Oct 03 09:13:38 crc kubenswrapper[4664]: I1003 09:13:38.293989 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7wcj/must-gather-8zp6x" event={"ID":"b5278517-8bdb-4383-9477-631fa551bf9d","Type":"ContainerStarted","Data":"0052cb23c870367a5c16500a7f5faf5ff02188751d51e356dfc73ec1fce77fc7"} Oct 03 09:13:42 crc kubenswrapper[4664]: I1003 09:13:42.338060 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7wcj/must-gather-8zp6x" event={"ID":"b5278517-8bdb-4383-9477-631fa551bf9d","Type":"ContainerStarted","Data":"e33c8e84bd795cd47f958735dda1723f43e435e4ab67440b695bdb89fa27ea12"} Oct 03 09:13:43 crc kubenswrapper[4664]: I1003 09:13:43.349826 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7wcj/must-gather-8zp6x" event={"ID":"b5278517-8bdb-4383-9477-631fa551bf9d","Type":"ContainerStarted","Data":"f368ce669fa9b72d5bbca31a768ae5e0196ebab5542a4cd390ff60e5aa30cce9"} Oct 03 09:13:43 crc kubenswrapper[4664]: I1003 09:13:43.391999 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v7wcj/must-gather-8zp6x" podStartSLOduration=2.420036078 podStartE2EDuration="6.391967284s" podCreationTimestamp="2025-10-03 09:13:37 +0000 UTC" firstStartedPulling="2025-10-03 09:13:37.885643517 +0000 UTC m=+5118.706834007" lastFinishedPulling="2025-10-03 09:13:41.857574723 +0000 UTC m=+5122.678765213" observedRunningTime="2025-10-03 09:13:43.369020377 +0000 UTC m=+5124.190210907" watchObservedRunningTime="2025-10-03 09:13:43.391967284 +0000 UTC m=+5124.213157784" Oct 03 09:13:47 crc kubenswrapper[4664]: I1003 09:13:47.527798 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v7wcj/crc-debug-z6zpm"] Oct 03 09:13:47 crc kubenswrapper[4664]: I1003 09:13:47.529765 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7wcj/crc-debug-z6zpm" Oct 03 09:13:47 crc kubenswrapper[4664]: I1003 09:13:47.531727 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-v7wcj"/"default-dockercfg-rd7wf" Oct 03 09:13:47 crc kubenswrapper[4664]: I1003 09:13:47.711879 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f88ba97c-5fff-4d9e-bef7-c0b187008eaf-host\") pod \"crc-debug-z6zpm\" (UID: \"f88ba97c-5fff-4d9e-bef7-c0b187008eaf\") " pod="openshift-must-gather-v7wcj/crc-debug-z6zpm" Oct 03 09:13:47 crc kubenswrapper[4664]: I1003 09:13:47.711967 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6hmb\" (UniqueName: \"kubernetes.io/projected/f88ba97c-5fff-4d9e-bef7-c0b187008eaf-kube-api-access-b6hmb\") pod \"crc-debug-z6zpm\" (UID: \"f88ba97c-5fff-4d9e-bef7-c0b187008eaf\") " pod="openshift-must-gather-v7wcj/crc-debug-z6zpm" Oct 03 09:13:47 crc kubenswrapper[4664]: I1003 09:13:47.813715 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f88ba97c-5fff-4d9e-bef7-c0b187008eaf-host\") pod \"crc-debug-z6zpm\" (UID: \"f88ba97c-5fff-4d9e-bef7-c0b187008eaf\") " pod="openshift-must-gather-v7wcj/crc-debug-z6zpm" Oct 03 09:13:47 crc kubenswrapper[4664]: I1003 09:13:47.813802 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6hmb\" (UniqueName: \"kubernetes.io/projected/f88ba97c-5fff-4d9e-bef7-c0b187008eaf-kube-api-access-b6hmb\") pod \"crc-debug-z6zpm\" (UID: \"f88ba97c-5fff-4d9e-bef7-c0b187008eaf\") " pod="openshift-must-gather-v7wcj/crc-debug-z6zpm" Oct 03 09:13:47 crc kubenswrapper[4664]: I1003 09:13:47.813983 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f88ba97c-5fff-4d9e-bef7-c0b187008eaf-host\") pod \"crc-debug-z6zpm\" (UID: \"f88ba97c-5fff-4d9e-bef7-c0b187008eaf\") " pod="openshift-must-gather-v7wcj/crc-debug-z6zpm" Oct 03 09:13:47 crc kubenswrapper[4664]: I1003 09:13:47.861142 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6hmb\" (UniqueName: \"kubernetes.io/projected/f88ba97c-5fff-4d9e-bef7-c0b187008eaf-kube-api-access-b6hmb\") pod \"crc-debug-z6zpm\" (UID: \"f88ba97c-5fff-4d9e-bef7-c0b187008eaf\") " pod="openshift-must-gather-v7wcj/crc-debug-z6zpm" Oct 03 09:13:48 crc kubenswrapper[4664]: I1003 09:13:48.149552 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7wcj/crc-debug-z6zpm" Oct 03 09:13:48 crc kubenswrapper[4664]: I1003 09:13:48.406744 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7wcj/crc-debug-z6zpm" event={"ID":"f88ba97c-5fff-4d9e-bef7-c0b187008eaf","Type":"ContainerStarted","Data":"feada5b70a867a8e3ed8254c90b16eb9e8306d3ca9ff8797489e7f6ad799426e"} Oct 03 09:14:00 crc kubenswrapper[4664]: I1003 09:14:00.520569 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7wcj/crc-debug-z6zpm" event={"ID":"f88ba97c-5fff-4d9e-bef7-c0b187008eaf","Type":"ContainerStarted","Data":"a81ddb17cf9d0844c4675cf281837cb42bba29baac0fc4488c75aaf5e125eb73"} Oct 03 09:14:00 crc kubenswrapper[4664]: I1003 09:14:00.542070 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v7wcj/crc-debug-z6zpm" podStartSLOduration=2.017918596 podStartE2EDuration="13.542049827s" podCreationTimestamp="2025-10-03 09:13:47 +0000 UTC" firstStartedPulling="2025-10-03 09:13:48.188192131 +0000 UTC m=+5129.009382621" lastFinishedPulling="2025-10-03 09:13:59.712323362 +0000 UTC m=+5140.533513852" observedRunningTime="2025-10-03 09:14:00.538732982 +0000 UTC m=+5141.359923502" watchObservedRunningTime="2025-10-03 09:14:00.542049827 +0000 UTC m=+5141.363240317" Oct 03 09:14:42 crc kubenswrapper[4664]: I1003 09:14:42.039574 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-msdnl"] Oct 03 09:14:42 crc kubenswrapper[4664]: I1003 09:14:42.042814 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:42 crc kubenswrapper[4664]: I1003 09:14:42.085013 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-msdnl"] Oct 03 09:14:42 crc kubenswrapper[4664]: I1003 09:14:42.211963 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d73d005-263f-4fdc-8a15-f7fec3f33005-utilities\") pod \"certified-operators-msdnl\" (UID: \"9d73d005-263f-4fdc-8a15-f7fec3f33005\") " pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:42 crc kubenswrapper[4664]: I1003 09:14:42.212077 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d73d005-263f-4fdc-8a15-f7fec3f33005-catalog-content\") pod \"certified-operators-msdnl\" (UID: \"9d73d005-263f-4fdc-8a15-f7fec3f33005\") " pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:42 crc kubenswrapper[4664]: I1003 09:14:42.212100 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h55zk\" (UniqueName: \"kubernetes.io/projected/9d73d005-263f-4fdc-8a15-f7fec3f33005-kube-api-access-h55zk\") pod \"certified-operators-msdnl\" (UID: \"9d73d005-263f-4fdc-8a15-f7fec3f33005\") " pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:42 crc kubenswrapper[4664]: I1003 09:14:42.314515 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d73d005-263f-4fdc-8a15-f7fec3f33005-utilities\") pod \"certified-operators-msdnl\" (UID: \"9d73d005-263f-4fdc-8a15-f7fec3f33005\") " pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:42 crc 
kubenswrapper[4664]: I1003 09:14:42.314674 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d73d005-263f-4fdc-8a15-f7fec3f33005-catalog-content\") pod \"certified-operators-msdnl\" (UID: \"9d73d005-263f-4fdc-8a15-f7fec3f33005\") " pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:42 crc kubenswrapper[4664]: I1003 09:14:42.314705 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h55zk\" (UniqueName: \"kubernetes.io/projected/9d73d005-263f-4fdc-8a15-f7fec3f33005-kube-api-access-h55zk\") pod \"certified-operators-msdnl\" (UID: \"9d73d005-263f-4fdc-8a15-f7fec3f33005\") " pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:42 crc kubenswrapper[4664]: I1003 09:14:42.315303 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d73d005-263f-4fdc-8a15-f7fec3f33005-catalog-content\") pod \"certified-operators-msdnl\" (UID: \"9d73d005-263f-4fdc-8a15-f7fec3f33005\") " pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:42 crc kubenswrapper[4664]: I1003 09:14:42.315319 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d73d005-263f-4fdc-8a15-f7fec3f33005-utilities\") pod \"certified-operators-msdnl\" (UID: \"9d73d005-263f-4fdc-8a15-f7fec3f33005\") " pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:42 crc kubenswrapper[4664]: I1003 09:14:42.338582 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h55zk\" (UniqueName: \"kubernetes.io/projected/9d73d005-263f-4fdc-8a15-f7fec3f33005-kube-api-access-h55zk\") pod \"certified-operators-msdnl\" (UID: \"9d73d005-263f-4fdc-8a15-f7fec3f33005\") " pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:42 crc kubenswrapper[4664]: I1003 09:14:42.389973 4664 util.go:30] "No sandbox for pod can be found. 
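[Annotation] Each certified-operators-msdnl volume above walks the same reconciler ladder: VerifyControllerAttachedVolume (a formality for node-local emptyDir and projected volumes, which need no external attach), then MountVolume started, then MountVolume.SetUp succeeded. A compressed sketch of that per-volume order, with stand-in function bodies; the real kubelet work is asynchronous and plugin-specific:

```go
package main

import "fmt"

// Stand-ins mirroring the reconciler_common.go / operation_generator.go
// entries above; only the ordering is meaningful here.
func verifyControllerAttachedVolume(name string) error {
	fmt.Printf("VerifyControllerAttachedVolume started for %q\n", name)
	return nil // emptyDir/projected volumes: nothing to attach
}

func mountVolume(name string) error {
	fmt.Printf("MountVolume started for %q\n", name)
	// plugin SetUp: create the emptyDir, or assemble the projected token
	fmt.Printf("MountVolume.SetUp succeeded for %q\n", name)
	return nil
}

func main() {
	for _, v := range []string{"utilities", "catalog-content", "kube-api-access-h55zk"} {
		if err := verifyControllerAttachedVolume(v); err == nil {
			_ = mountVolume(v)
		}
	}
}
```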
Need to start a new one" pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:42 crc kubenswrapper[4664]: I1003 09:14:42.740248 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-msdnl"] Oct 03 09:14:42 crc kubenswrapper[4664]: I1003 09:14:42.936979 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-msdnl" event={"ID":"9d73d005-263f-4fdc-8a15-f7fec3f33005","Type":"ContainerStarted","Data":"28a7528b8629aea91cb9ba5e20edf8c412c3924fd2cadb9b3bbb4a67be90831f"} Oct 03 09:14:43 crc kubenswrapper[4664]: I1003 09:14:43.951016 4664 generic.go:334] "Generic (PLEG): container finished" podID="9d73d005-263f-4fdc-8a15-f7fec3f33005" containerID="6051fe2e5ec68bb05ba67809a2670120f923c4940aa778f6384e41cbb7d66e7f" exitCode=0 Oct 03 09:14:43 crc kubenswrapper[4664]: I1003 09:14:43.951220 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-msdnl" event={"ID":"9d73d005-263f-4fdc-8a15-f7fec3f33005","Type":"ContainerDied","Data":"6051fe2e5ec68bb05ba67809a2670120f923c4940aa778f6384e41cbb7d66e7f"} Oct 03 09:14:45 crc kubenswrapper[4664]: I1003 09:14:45.980279 4664 generic.go:334] "Generic (PLEG): container finished" podID="9d73d005-263f-4fdc-8a15-f7fec3f33005" containerID="a0f05db1122fd10e070ab80646588e6b2f0eda3f7bddb043f7cc2543054c28ee" exitCode=0 Oct 03 09:14:45 crc kubenswrapper[4664]: I1003 09:14:45.980379 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-msdnl" event={"ID":"9d73d005-263f-4fdc-8a15-f7fec3f33005","Type":"ContainerDied","Data":"a0f05db1122fd10e070ab80646588e6b2f0eda3f7bddb043f7cc2543054c28ee"} Oct 03 09:14:48 crc kubenswrapper[4664]: I1003 09:14:48.006820 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-msdnl" event={"ID":"9d73d005-263f-4fdc-8a15-f7fec3f33005","Type":"ContainerStarted","Data":"041481d71f76fd5be2f04868299ad9426da7f69c84289bac8fe3e3579438b0fa"} Oct 03 09:14:52 crc kubenswrapper[4664]: I1003 09:14:52.390177 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:52 crc kubenswrapper[4664]: I1003 09:14:52.390673 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:52 crc kubenswrapper[4664]: I1003 09:14:52.459655 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:52 crc kubenswrapper[4664]: I1003 09:14:52.485465 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-msdnl" podStartSLOduration=7.713322089 podStartE2EDuration="10.485445641s" podCreationTimestamp="2025-10-03 09:14:42 +0000 UTC" firstStartedPulling="2025-10-03 09:14:43.95431438 +0000 UTC m=+5184.775504870" lastFinishedPulling="2025-10-03 09:14:46.726437932 +0000 UTC m=+5187.547628422" observedRunningTime="2025-10-03 09:14:48.029973666 +0000 UTC m=+5188.851164186" watchObservedRunningTime="2025-10-03 09:14:52.485445641 +0000 UTC m=+5193.306636131" Oct 03 09:14:52 crc kubenswrapper[4664]: I1003 09:14:52.614003 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-797fcd6b7d-kpnbp_dc9a9e6e-b8f5-4991-9a64-7a928f66075c/barbican-api/0.log" Oct 03 09:14:52 crc kubenswrapper[4664]: I1003 09:14:52.621298 4664 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-797fcd6b7d-kpnbp_dc9a9e6e-b8f5-4991-9a64-7a928f66075c/barbican-api-log/0.log" Oct 03 09:14:52 crc kubenswrapper[4664]: I1003 09:14:52.863592 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58f9b497cd-8m4l7_ff0f93c1-983e-4202-b659-9a4b68fb015e/barbican-keystone-listener/0.log" Oct 03 09:14:52 crc kubenswrapper[4664]: I1003 09:14:52.897359 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58f9b497cd-8m4l7_ff0f93c1-983e-4202-b659-9a4b68fb015e/barbican-keystone-listener-log/0.log" Oct 03 09:14:53 crc kubenswrapper[4664]: I1003 09:14:53.090777 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7f467f54bc-hkh4m_1ca16101-0bee-4cb4-b9f4-3a2db110eaba/barbican-worker/0.log" Oct 03 09:14:53 crc kubenswrapper[4664]: I1003 09:14:53.103931 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:53 crc kubenswrapper[4664]: I1003 09:14:53.137238 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7f467f54bc-hkh4m_1ca16101-0bee-4cb4-b9f4-3a2db110eaba/barbican-worker-log/0.log" Oct 03 09:14:53 crc kubenswrapper[4664]: I1003 09:14:53.152033 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-msdnl"] Oct 03 09:14:53 crc kubenswrapper[4664]: I1003 09:14:53.303402 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-5hmgn_d922fb2a-651d-432e-9859-c89cd4a2268f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:14:53 crc kubenswrapper[4664]: I1003 09:14:53.550530 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9d15d050-2edb-489f-aa55-439467f10bd8/ceilometer-central-agent/0.log" Oct 03 09:14:53 crc kubenswrapper[4664]: I1003 09:14:53.627265 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9d15d050-2edb-489f-aa55-439467f10bd8/proxy-httpd/0.log" Oct 03 09:14:53 crc kubenswrapper[4664]: I1003 09:14:53.664400 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9d15d050-2edb-489f-aa55-439467f10bd8/ceilometer-notification-agent/0.log" Oct 03 09:14:53 crc kubenswrapper[4664]: I1003 09:14:53.772891 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9d15d050-2edb-489f-aa55-439467f10bd8/sg-core/0.log" Oct 03 09:14:53 crc kubenswrapper[4664]: I1003 09:14:53.953589 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_81114761-59aa-4b5d-8848-963f6c73efe2/cinder-api/0.log" Oct 03 09:14:54 crc kubenswrapper[4664]: I1003 09:14:54.037171 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_81114761-59aa-4b5d-8848-963f6c73efe2/cinder-api-log/0.log" Oct 03 09:14:54 crc kubenswrapper[4664]: I1003 09:14:54.170581 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0bb12c08-bee2-4964-892c-98b4d9a75b82/cinder-scheduler/0.log" Oct 03 09:14:54 crc kubenswrapper[4664]: I1003 09:14:54.290851 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0bb12c08-bee2-4964-892c-98b4d9a75b82/probe/0.log" Oct 03 09:14:54 crc kubenswrapper[4664]: I1003 09:14:54.362960 4664 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-bj8rm_f1a9115e-e1a1-4de5-8a59-3b23ac395ec4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:14:54 crc kubenswrapper[4664]: I1003 09:14:54.535642 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-77v4r_fa362fe3-175b-4212-b34c-341eab1572cf/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:14:54 crc kubenswrapper[4664]: I1003 09:14:54.738355 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-gtls8_1bec1364-2710-466d-83d1-66e226d2e314/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:14:54 crc kubenswrapper[4664]: I1003 09:14:54.797757 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-t8fh9_f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a/init/0.log" Oct 03 09:14:54 crc kubenswrapper[4664]: I1003 09:14:54.946751 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-t8fh9_f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a/init/0.log" Oct 03 09:14:54 crc kubenswrapper[4664]: I1003 09:14:54.992801 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-t8fh9_f95a9f32-c6c5-4b5d-be2c-faab7a8d8e6a/dnsmasq-dns/0.log" Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.044585 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-5tthl_70c69b69-6f48-4169-9cbc-a145ed1d8e07/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.073902 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-msdnl" podUID="9d73d005-263f-4fdc-8a15-f7fec3f33005" containerName="registry-server" containerID="cri-o://041481d71f76fd5be2f04868299ad9426da7f69c84289bac8fe3e3579438b0fa" gracePeriod=2 Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.183230 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8f9a018c-ae4c-475f-a813-9b4cf0e51f49/glance-httpd/0.log" Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.222176 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8f9a018c-ae4c-475f-a813-9b4cf0e51f49/glance-log/0.log" Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.426898 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_44bbde12-29da-4681-a413-58bd1db590e9/glance-httpd/0.log" Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.500408 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.543421 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_44bbde12-29da-4681-a413-58bd1db590e9/glance-log/0.log" Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.574107 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d73d005-263f-4fdc-8a15-f7fec3f33005-utilities\") pod \"9d73d005-263f-4fdc-8a15-f7fec3f33005\" (UID: \"9d73d005-263f-4fdc-8a15-f7fec3f33005\") " Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.574326 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d73d005-263f-4fdc-8a15-f7fec3f33005-catalog-content\") pod \"9d73d005-263f-4fdc-8a15-f7fec3f33005\" (UID: \"9d73d005-263f-4fdc-8a15-f7fec3f33005\") " Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.574445 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h55zk\" (UniqueName: \"kubernetes.io/projected/9d73d005-263f-4fdc-8a15-f7fec3f33005-kube-api-access-h55zk\") pod \"9d73d005-263f-4fdc-8a15-f7fec3f33005\" (UID: \"9d73d005-263f-4fdc-8a15-f7fec3f33005\") " Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.575820 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d73d005-263f-4fdc-8a15-f7fec3f33005-utilities" (OuterVolumeSpecName: "utilities") pod "9d73d005-263f-4fdc-8a15-f7fec3f33005" (UID: "9d73d005-263f-4fdc-8a15-f7fec3f33005"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.596176 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d73d005-263f-4fdc-8a15-f7fec3f33005-kube-api-access-h55zk" (OuterVolumeSpecName: "kube-api-access-h55zk") pod "9d73d005-263f-4fdc-8a15-f7fec3f33005" (UID: "9d73d005-263f-4fdc-8a15-f7fec3f33005"). InnerVolumeSpecName "kube-api-access-h55zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.650224 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d73d005-263f-4fdc-8a15-f7fec3f33005-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d73d005-263f-4fdc-8a15-f7fec3f33005" (UID: "9d73d005-263f-4fdc-8a15-f7fec3f33005"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.676144 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h55zk\" (UniqueName: \"kubernetes.io/projected/9d73d005-263f-4fdc-8a15-f7fec3f33005-kube-api-access-h55zk\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.676172 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d73d005-263f-4fdc-8a15-f7fec3f33005-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.676182 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d73d005-263f-4fdc-8a15-f7fec3f33005-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.712848 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-85fcf9fb6-r8r76_f1015cf1-8e4b-44fd-a794-27edfecdceed/horizon/0.log" Oct 03 09:14:55 crc kubenswrapper[4664]: I1003 09:14:55.808276 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-5l55k_9334fa32-3e3c-4a4c-ab00-71902c455beb/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.087080 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-6kz44_d40d65c6-7d3e-4be9-8c0c-a24b74166668/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.095076 4664 generic.go:334] "Generic (PLEG): container finished" podID="9d73d005-263f-4fdc-8a15-f7fec3f33005" containerID="041481d71f76fd5be2f04868299ad9426da7f69c84289bac8fe3e3579438b0fa" exitCode=0 Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.095120 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-msdnl" event={"ID":"9d73d005-263f-4fdc-8a15-f7fec3f33005","Type":"ContainerDied","Data":"041481d71f76fd5be2f04868299ad9426da7f69c84289bac8fe3e3579438b0fa"} Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.095151 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-msdnl" event={"ID":"9d73d005-263f-4fdc-8a15-f7fec3f33005","Type":"ContainerDied","Data":"28a7528b8629aea91cb9ba5e20edf8c412c3924fd2cadb9b3bbb4a67be90831f"} Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.095168 4664 scope.go:117] "RemoveContainer" containerID="041481d71f76fd5be2f04868299ad9426da7f69c84289bac8fe3e3579438b0fa" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.095186 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-msdnl" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.126958 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-msdnl"] Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.127624 4664 scope.go:117] "RemoveContainer" containerID="a0f05db1122fd10e070ab80646588e6b2f0eda3f7bddb043f7cc2543054c28ee" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.134312 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-msdnl"] Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.155784 4664 scope.go:117] "RemoveContainer" containerID="6051fe2e5ec68bb05ba67809a2670120f923c4940aa778f6384e41cbb7d66e7f" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.223387 4664 scope.go:117] "RemoveContainer" containerID="041481d71f76fd5be2f04868299ad9426da7f69c84289bac8fe3e3579438b0fa" Oct 03 09:14:56 crc kubenswrapper[4664]: E1003 09:14:56.224435 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"041481d71f76fd5be2f04868299ad9426da7f69c84289bac8fe3e3579438b0fa\": container with ID starting with 041481d71f76fd5be2f04868299ad9426da7f69c84289bac8fe3e3579438b0fa not found: ID does not exist" containerID="041481d71f76fd5be2f04868299ad9426da7f69c84289bac8fe3e3579438b0fa" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.224503 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"041481d71f76fd5be2f04868299ad9426da7f69c84289bac8fe3e3579438b0fa"} err="failed to get container status \"041481d71f76fd5be2f04868299ad9426da7f69c84289bac8fe3e3579438b0fa\": rpc error: code = NotFound desc = could not find container \"041481d71f76fd5be2f04868299ad9426da7f69c84289bac8fe3e3579438b0fa\": container with ID starting with 041481d71f76fd5be2f04868299ad9426da7f69c84289bac8fe3e3579438b0fa not found: ID does not exist" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.224531 4664 scope.go:117] "RemoveContainer" containerID="a0f05db1122fd10e070ab80646588e6b2f0eda3f7bddb043f7cc2543054c28ee" Oct 03 09:14:56 crc kubenswrapper[4664]: E1003 09:14:56.225006 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0f05db1122fd10e070ab80646588e6b2f0eda3f7bddb043f7cc2543054c28ee\": container with ID starting with a0f05db1122fd10e070ab80646588e6b2f0eda3f7bddb043f7cc2543054c28ee not found: ID does not exist" containerID="a0f05db1122fd10e070ab80646588e6b2f0eda3f7bddb043f7cc2543054c28ee" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.225108 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f05db1122fd10e070ab80646588e6b2f0eda3f7bddb043f7cc2543054c28ee"} err="failed to get container status \"a0f05db1122fd10e070ab80646588e6b2f0eda3f7bddb043f7cc2543054c28ee\": rpc error: code = NotFound desc = could not find container \"a0f05db1122fd10e070ab80646588e6b2f0eda3f7bddb043f7cc2543054c28ee\": container with ID starting with a0f05db1122fd10e070ab80646588e6b2f0eda3f7bddb043f7cc2543054c28ee not found: ID does not exist" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.225186 4664 scope.go:117] "RemoveContainer" containerID="6051fe2e5ec68bb05ba67809a2670120f923c4940aa778f6384e41cbb7d66e7f" Oct 03 09:14:56 crc kubenswrapper[4664]: E1003 09:14:56.225683 4664 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6051fe2e5ec68bb05ba67809a2670120f923c4940aa778f6384e41cbb7d66e7f\": container with ID starting with 6051fe2e5ec68bb05ba67809a2670120f923c4940aa778f6384e41cbb7d66e7f not found: ID does not exist" containerID="6051fe2e5ec68bb05ba67809a2670120f923c4940aa778f6384e41cbb7d66e7f" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.225740 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6051fe2e5ec68bb05ba67809a2670120f923c4940aa778f6384e41cbb7d66e7f"} err="failed to get container status \"6051fe2e5ec68bb05ba67809a2670120f923c4940aa778f6384e41cbb7d66e7f\": rpc error: code = NotFound desc = could not find container \"6051fe2e5ec68bb05ba67809a2670120f923c4940aa778f6384e41cbb7d66e7f\": container with ID starting with 6051fe2e5ec68bb05ba67809a2670120f923c4940aa778f6384e41cbb7d66e7f not found: ID does not exist" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.317691 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-85fcf9fb6-r8r76_f1015cf1-8e4b-44fd-a794-27edfecdceed/horizon-log/0.log" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.354422 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-ccfbd46bc-qz9qm_05056d09-f95e-40cb-96e7-100243e5a858/keystone-api/0.log" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.396087 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29324701-s7s4z_58076b89-9a2c-42e4-83dc-f26ef09f5d55/keystone-cron/0.log" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.521164 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f0438fa7-19ee-4886-a77a-bc835552ca0a/kube-state-metrics/0.log" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.869337 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b754456d9-lc2mg_d882eea5-c7df-4023-b542-96e4057ad948/neutron-api/0.log" Oct 03 09:14:56 crc kubenswrapper[4664]: I1003 09:14:56.891140 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b754456d9-lc2mg_d882eea5-c7df-4023-b542-96e4057ad948/neutron-httpd/0.log" Oct 03 09:14:57 crc kubenswrapper[4664]: I1003 09:14:57.304875 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b708c60d-d069-4a68-8bf7-0d2e9a325eb0/nova-api-log/0.log" Oct 03 09:14:57 crc kubenswrapper[4664]: I1003 09:14:57.585493 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b708c60d-d069-4a68-8bf7-0d2e9a325eb0/nova-api-api/0.log" Oct 03 09:14:57 crc kubenswrapper[4664]: I1003 09:14:57.815098 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_055183e7-9fec-4cb4-858b-3a9f7fabfdcf/nova-cell0-conductor-conductor/0.log" Oct 03 09:14:57 crc kubenswrapper[4664]: I1003 09:14:57.889677 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d73d005-263f-4fdc-8a15-f7fec3f33005" path="/var/lib/kubelet/pods/9d73d005-263f-4fdc-8a15-f7fec3f33005/volumes" Oct 03 09:14:57 crc kubenswrapper[4664]: I1003 09:14:57.956214 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_647a5434-1e87-43fa-a85c-e57c75f985f3/nova-cell1-conductor-conductor/0.log" Oct 03 09:14:58 crc kubenswrapper[4664]: I1003 09:14:58.115644 4664 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_dc938212-d4cd-4ab5-bc5c-126715d0e3d4/nova-cell1-novncproxy-novncproxy/0.log" Oct 03 09:14:58 crc kubenswrapper[4664]: I1003 09:14:58.358803 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_817b8b02-eef7-4753-ad19-8bf7fd3fbe9a/nova-metadata-log/0.log" Oct 03 09:14:58 crc kubenswrapper[4664]: I1003 09:14:58.975284 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e271fae6-d173-43f6-ad2b-27e3c182134b/nova-scheduler-scheduler/0.log" Oct 03 09:14:59 crc kubenswrapper[4664]: I1003 09:14:59.261482 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0710a5a3-3e65-42b8-bd1d-d40deb6a325d/mysql-bootstrap/0.log" Oct 03 09:14:59 crc kubenswrapper[4664]: I1003 09:14:59.403291 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0710a5a3-3e65-42b8-bd1d-d40deb6a325d/mysql-bootstrap/0.log" Oct 03 09:14:59 crc kubenswrapper[4664]: I1003 09:14:59.481078 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0710a5a3-3e65-42b8-bd1d-d40deb6a325d/galera/0.log" Oct 03 09:14:59 crc kubenswrapper[4664]: I1003 09:14:59.794248 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a5b770eb-2222-43c8-bb15-6e2d18e95fbf/mysql-bootstrap/0.log" Oct 03 09:14:59 crc kubenswrapper[4664]: I1003 09:14:59.934873 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a5b770eb-2222-43c8-bb15-6e2d18e95fbf/mysql-bootstrap/0.log" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.048000 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a5b770eb-2222-43c8-bb15-6e2d18e95fbf/galera/0.log" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.164499 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n"] Oct 03 09:15:00 crc kubenswrapper[4664]: E1003 09:15:00.164964 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d73d005-263f-4fdc-8a15-f7fec3f33005" containerName="extract-utilities" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.164984 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d73d005-263f-4fdc-8a15-f7fec3f33005" containerName="extract-utilities" Oct 03 09:15:00 crc kubenswrapper[4664]: E1003 09:15:00.164999 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d73d005-263f-4fdc-8a15-f7fec3f33005" containerName="registry-server" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.165006 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d73d005-263f-4fdc-8a15-f7fec3f33005" containerName="registry-server" Oct 03 09:15:00 crc kubenswrapper[4664]: E1003 09:15:00.165028 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d73d005-263f-4fdc-8a15-f7fec3f33005" containerName="extract-content" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.165034 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d73d005-263f-4fdc-8a15-f7fec3f33005" containerName="extract-content" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.165207 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d73d005-263f-4fdc-8a15-f7fec3f33005" containerName="registry-server" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.165932 4664 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.169241 4664 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.169474 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.179591 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n"] Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.260774 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltcjg\" (UniqueName: \"kubernetes.io/projected/79e397f2-9430-46e9-bce4-2a0aac2c49a9-kube-api-access-ltcjg\") pod \"collect-profiles-29324715-bqd7n\" (UID: \"79e397f2-9430-46e9-bce4-2a0aac2c49a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.261192 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79e397f2-9430-46e9-bce4-2a0aac2c49a9-secret-volume\") pod \"collect-profiles-29324715-bqd7n\" (UID: \"79e397f2-9430-46e9-bce4-2a0aac2c49a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.261471 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79e397f2-9430-46e9-bce4-2a0aac2c49a9-config-volume\") pod \"collect-profiles-29324715-bqd7n\" (UID: \"79e397f2-9430-46e9-bce4-2a0aac2c49a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.275126 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_06ee9d13-7b0c-4619-8421-6f1a5d8a2f05/openstackclient/0.log" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.363028 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79e397f2-9430-46e9-bce4-2a0aac2c49a9-config-volume\") pod \"collect-profiles-29324715-bqd7n\" (UID: \"79e397f2-9430-46e9-bce4-2a0aac2c49a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.363385 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltcjg\" (UniqueName: \"kubernetes.io/projected/79e397f2-9430-46e9-bce4-2a0aac2c49a9-kube-api-access-ltcjg\") pod \"collect-profiles-29324715-bqd7n\" (UID: \"79e397f2-9430-46e9-bce4-2a0aac2c49a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.363713 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79e397f2-9430-46e9-bce4-2a0aac2c49a9-secret-volume\") pod \"collect-profiles-29324715-bqd7n\" (UID: \"79e397f2-9430-46e9-bce4-2a0aac2c49a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" Oct 03 09:15:00 crc 
kubenswrapper[4664]: I1003 09:15:00.364402 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79e397f2-9430-46e9-bce4-2a0aac2c49a9-config-volume\") pod \"collect-profiles-29324715-bqd7n\" (UID: \"79e397f2-9430-46e9-bce4-2a0aac2c49a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.370780 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79e397f2-9430-46e9-bce4-2a0aac2c49a9-secret-volume\") pod \"collect-profiles-29324715-bqd7n\" (UID: \"79e397f2-9430-46e9-bce4-2a0aac2c49a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.387733 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltcjg\" (UniqueName: \"kubernetes.io/projected/79e397f2-9430-46e9-bce4-2a0aac2c49a9-kube-api-access-ltcjg\") pod \"collect-profiles-29324715-bqd7n\" (UID: \"79e397f2-9430-46e9-bce4-2a0aac2c49a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.460599 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-fb5ld_b65aa3e9-2d60-4cd9-b63a-93a07ab33e72/ovn-controller/0.log" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.508727 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.759178 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_817b8b02-eef7-4753-ad19-8bf7fd3fbe9a/nova-metadata-metadata/0.log" Oct 03 09:15:00 crc kubenswrapper[4664]: I1003 09:15:00.822495 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-2k7c8_9f1aa7a2-0b64-4d75-9e05-9a53c987be28/openstack-network-exporter/0.log" Oct 03 09:15:01 crc kubenswrapper[4664]: I1003 09:15:01.026195 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n"] Oct 03 09:15:01 crc kubenswrapper[4664]: I1003 09:15:01.092378 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qcwp9_deb5f246-e857-4517-9f9c-290bc76ba8f6/ovsdb-server-init/0.log" Oct 03 09:15:01 crc kubenswrapper[4664]: I1003 09:15:01.167236 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" event={"ID":"79e397f2-9430-46e9-bce4-2a0aac2c49a9","Type":"ContainerStarted","Data":"a923f643af9cc86f7b28de9947a1928e10a6287b8f05970d93775f8846cfea32"} Oct 03 09:15:01 crc kubenswrapper[4664]: I1003 09:15:01.258182 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qcwp9_deb5f246-e857-4517-9f9c-290bc76ba8f6/ovsdb-server-init/0.log" Oct 03 09:15:01 crc kubenswrapper[4664]: I1003 09:15:01.308147 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qcwp9_deb5f246-e857-4517-9f9c-290bc76ba8f6/ovsdb-server/0.log" Oct 03 09:15:01 crc kubenswrapper[4664]: I1003 09:15:01.320472 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qcwp9_deb5f246-e857-4517-9f9c-290bc76ba8f6/ovs-vswitchd/0.log" Oct 03 
09:15:01 crc kubenswrapper[4664]: I1003 09:15:01.678044 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-5fkkn_2e962f71-22f7-48d9-af7d-53baea22b9cc/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:15:01 crc kubenswrapper[4664]: I1003 09:15:01.846556 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-cbsf6_da2b4748-a1ed-4edd-ba5e-2bdc917790dd/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:15:01 crc kubenswrapper[4664]: I1003 09:15:01.903757 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fgwdn_a36410cf-2458-4589-86e0-e921e7489d07/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:15:02 crc kubenswrapper[4664]: I1003 09:15:02.130264 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-l9xcl_41264278-270d-4f29-b68a-15340641bcb4/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:15:02 crc kubenswrapper[4664]: I1003 09:15:02.178498 4664 generic.go:334] "Generic (PLEG): container finished" podID="79e397f2-9430-46e9-bce4-2a0aac2c49a9" containerID="e0296fd42958f3114a239ea96cc4d6b63ac14091c34a1183d1583deb615b16d7" exitCode=0 Oct 03 09:15:02 crc kubenswrapper[4664]: I1003 09:15:02.178556 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" event={"ID":"79e397f2-9430-46e9-bce4-2a0aac2c49a9","Type":"ContainerDied","Data":"e0296fd42958f3114a239ea96cc4d6b63ac14091c34a1183d1583deb615b16d7"} Oct 03 09:15:02 crc kubenswrapper[4664]: I1003 09:15:02.381926 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-sk7ch_50ab7fd6-d934-496e-96fd-debf67b6b634/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:15:02 crc kubenswrapper[4664]: I1003 09:15:02.465676 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-v46ns_c317ebff-b55c-4484-97a8-d90142316326/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:15:02 crc kubenswrapper[4664]: I1003 09:15:02.654862 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-wbt4p_e714e59e-a513-4061-852d-0e7a0f36e923/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.079321 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_55811e3b-b345-4cc9-9ade-c8f977a0706c/ovn-northd/0.log" Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.194876 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_55811e3b-b345-4cc9-9ade-c8f977a0706c/openstack-network-exporter/0.log" Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.257382 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3/openstack-network-exporter/0.log" Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.407239 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3eeeda4d-56ed-4cc1-8f0a-fb43df8f94b3/ovsdbserver-nb/0.log" Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.507220 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78/openstack-network-exporter/0.log" Oct 
03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.571056 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.657637 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79e397f2-9430-46e9-bce4-2a0aac2c49a9-config-volume\") pod \"79e397f2-9430-46e9-bce4-2a0aac2c49a9\" (UID: \"79e397f2-9430-46e9-bce4-2a0aac2c49a9\") " Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.657735 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltcjg\" (UniqueName: \"kubernetes.io/projected/79e397f2-9430-46e9-bce4-2a0aac2c49a9-kube-api-access-ltcjg\") pod \"79e397f2-9430-46e9-bce4-2a0aac2c49a9\" (UID: \"79e397f2-9430-46e9-bce4-2a0aac2c49a9\") " Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.657815 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79e397f2-9430-46e9-bce4-2a0aac2c49a9-secret-volume\") pod \"79e397f2-9430-46e9-bce4-2a0aac2c49a9\" (UID: \"79e397f2-9430-46e9-bce4-2a0aac2c49a9\") " Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.658705 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79e397f2-9430-46e9-bce4-2a0aac2c49a9-config-volume" (OuterVolumeSpecName: "config-volume") pod "79e397f2-9430-46e9-bce4-2a0aac2c49a9" (UID: "79e397f2-9430-46e9-bce4-2a0aac2c49a9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.659034 4664 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79e397f2-9430-46e9-bce4-2a0aac2c49a9-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.664893 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e397f2-9430-46e9-bce4-2a0aac2c49a9-kube-api-access-ltcjg" (OuterVolumeSpecName: "kube-api-access-ltcjg") pod "79e397f2-9430-46e9-bce4-2a0aac2c49a9" (UID: "79e397f2-9430-46e9-bce4-2a0aac2c49a9"). InnerVolumeSpecName "kube-api-access-ltcjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.665558 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e397f2-9430-46e9-bce4-2a0aac2c49a9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "79e397f2-9430-46e9-bce4-2a0aac2c49a9" (UID: "79e397f2-9430-46e9-bce4-2a0aac2c49a9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.693986 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dd7a7515-e2bf-4c9f-9dac-1c0bd3b87e78/ovsdbserver-sb/0.log" Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.761559 4664 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79e397f2-9430-46e9-bce4-2a0aac2c49a9-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.761834 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltcjg\" (UniqueName: \"kubernetes.io/projected/79e397f2-9430-46e9-bce4-2a0aac2c49a9-kube-api-access-ltcjg\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:03 crc kubenswrapper[4664]: I1003 09:15:03.880160 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67cd87976d-7fbgw_a226bb56-10cc-42f6-81dc-bf62f7f4038d/placement-api/0.log" Oct 03 09:15:04 crc kubenswrapper[4664]: I1003 09:15:04.045105 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67cd87976d-7fbgw_a226bb56-10cc-42f6-81dc-bf62f7f4038d/placement-log/0.log" Oct 03 09:15:04 crc kubenswrapper[4664]: I1003 09:15:04.094835 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_309f57eb-9191-4466-a9eb-4beac6f647ae/setup-container/0.log" Oct 03 09:15:04 crc kubenswrapper[4664]: I1003 09:15:04.198715 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" event={"ID":"79e397f2-9430-46e9-bce4-2a0aac2c49a9","Type":"ContainerDied","Data":"a923f643af9cc86f7b28de9947a1928e10a6287b8f05970d93775f8846cfea32"} Oct 03 09:15:04 crc kubenswrapper[4664]: I1003 09:15:04.198754 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a923f643af9cc86f7b28de9947a1928e10a6287b8f05970d93775f8846cfea32" Oct 03 09:15:04 crc kubenswrapper[4664]: I1003 09:15:04.198813 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-bqd7n" Oct 03 09:15:04 crc kubenswrapper[4664]: I1003 09:15:04.291097 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_309f57eb-9191-4466-a9eb-4beac6f647ae/rabbitmq/0.log" Oct 03 09:15:04 crc kubenswrapper[4664]: I1003 09:15:04.299892 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_309f57eb-9191-4466-a9eb-4beac6f647ae/setup-container/0.log" Oct 03 09:15:04 crc kubenswrapper[4664]: I1003 09:15:04.481176 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2d2ecc09-bcf3-4702-9456-e6c6880256cb/setup-container/0.log" Oct 03 09:15:04 crc kubenswrapper[4664]: I1003 09:15:04.644887 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv"] Oct 03 09:15:04 crc kubenswrapper[4664]: I1003 09:15:04.654269 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-cwfcv"] Oct 03 09:15:04 crc kubenswrapper[4664]: I1003 09:15:04.728285 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2d2ecc09-bcf3-4702-9456-e6c6880256cb/setup-container/0.log" Oct 03 09:15:04 crc kubenswrapper[4664]: I1003 09:15:04.746759 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2d2ecc09-bcf3-4702-9456-e6c6880256cb/rabbitmq/0.log" Oct 03 09:15:04 crc kubenswrapper[4664]: I1003 09:15:04.931390 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-2pdvl_2080288a-c108-46e5-b794-0a3f41eb2e31/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:15:04 crc kubenswrapper[4664]: I1003 09:15:04.980017 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-f7php_52361bdc-8d63-429f-b009-d28f61360dd8/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:15:05 crc kubenswrapper[4664]: I1003 09:15:05.201495 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-h6fr8_45e227c3-f408-44ab-808f-8351845af92c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:15:05 crc kubenswrapper[4664]: I1003 09:15:05.445394 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-fgwgk_08f00e80-ab83-47e3-b8e6-71d4d76300c4/ssh-known-hosts-edpm-deployment/0.log" Oct 03 09:15:05 crc kubenswrapper[4664]: I1003 09:15:05.476114 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-zs964_9f440401-13c2-4c2f-aff8-41e856a920c5/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:15:05 crc kubenswrapper[4664]: I1003 09:15:05.715149 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-bcb7d647f-zjhrm_f47ce762-22b9-4066-87d2-39e16a5b6c6d/proxy-server/0.log" Oct 03 09:15:05 crc kubenswrapper[4664]: I1003 09:15:05.873695 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-bcb7d647f-zjhrm_f47ce762-22b9-4066-87d2-39e16a5b6c6d/proxy-httpd/0.log" Oct 03 09:15:05 crc kubenswrapper[4664]: I1003 09:15:05.891926 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ad251f2-3025-4616-a70f-b280170c1443" 
path="/var/lib/kubelet/pods/8ad251f2-3025-4616-a70f-b280170c1443/volumes" Oct 03 09:15:05 crc kubenswrapper[4664]: I1003 09:15:05.907688 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-pwkt9_861470e0-672f-4457-86cd-9711fc6dd059/swift-ring-rebalance/0.log" Oct 03 09:15:06 crc kubenswrapper[4664]: I1003 09:15:06.059519 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e/account-auditor/0.log" Oct 03 09:15:06 crc kubenswrapper[4664]: I1003 09:15:06.117509 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e/account-reaper/0.log" Oct 03 09:15:06 crc kubenswrapper[4664]: I1003 09:15:06.285593 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e/account-replicator/0.log" Oct 03 09:15:06 crc kubenswrapper[4664]: I1003 09:15:06.298745 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e/account-server/0.log" Oct 03 09:15:06 crc kubenswrapper[4664]: I1003 09:15:06.392293 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e/container-auditor/0.log" Oct 03 09:15:06 crc kubenswrapper[4664]: I1003 09:15:06.505013 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e/container-server/0.log" Oct 03 09:15:06 crc kubenswrapper[4664]: I1003 09:15:06.538370 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e/container-replicator/0.log" Oct 03 09:15:06 crc kubenswrapper[4664]: I1003 09:15:06.594069 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e/container-updater/0.log" Oct 03 09:15:06 crc kubenswrapper[4664]: I1003 09:15:06.741190 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e/object-expirer/0.log" Oct 03 09:15:06 crc kubenswrapper[4664]: I1003 09:15:06.784463 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e/object-auditor/0.log" Oct 03 09:15:06 crc kubenswrapper[4664]: I1003 09:15:06.815726 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e/object-replicator/0.log" Oct 03 09:15:06 crc kubenswrapper[4664]: I1003 09:15:06.991048 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e/object-server/0.log" Oct 03 09:15:07 crc kubenswrapper[4664]: I1003 09:15:07.007858 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e/object-updater/0.log" Oct 03 09:15:07 crc kubenswrapper[4664]: I1003 09:15:07.046886 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e/rsync/0.log" Oct 03 09:15:07 crc kubenswrapper[4664]: I1003 09:15:07.230428 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2768ef4c-9c4d-40db-a5e0-6b45d1b0d90e/swift-recon-cron/0.log" Oct 03 09:15:07 crc kubenswrapper[4664]: I1003 
09:15:07.276039 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-57b7j_8307f32a-5a7d-4239-bd00-5b69c80f407d/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 09:15:10 crc kubenswrapper[4664]: I1003 09:15:10.358982 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_350764d9-9981-4b2d-b69d-42712338bdd1/memcached/0.log" Oct 03 09:15:27 crc kubenswrapper[4664]: I1003 09:15:27.778934 4664 scope.go:117] "RemoveContainer" containerID="fb219621c5cd41755eb4f9aa1afe1895258112550fec6d11895f09a6ef88c605" Oct 03 09:15:39 crc kubenswrapper[4664]: I1003 09:15:39.559454 4664 generic.go:334] "Generic (PLEG): container finished" podID="f88ba97c-5fff-4d9e-bef7-c0b187008eaf" containerID="a81ddb17cf9d0844c4675cf281837cb42bba29baac0fc4488c75aaf5e125eb73" exitCode=0 Oct 03 09:15:39 crc kubenswrapper[4664]: I1003 09:15:39.560146 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7wcj/crc-debug-z6zpm" event={"ID":"f88ba97c-5fff-4d9e-bef7-c0b187008eaf","Type":"ContainerDied","Data":"a81ddb17cf9d0844c4675cf281837cb42bba29baac0fc4488c75aaf5e125eb73"} Oct 03 09:15:40 crc kubenswrapper[4664]: I1003 09:15:40.670161 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v7wcj/crc-debug-z6zpm" Oct 03 09:15:40 crc kubenswrapper[4664]: I1003 09:15:40.704463 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v7wcj/crc-debug-z6zpm"] Oct 03 09:15:40 crc kubenswrapper[4664]: I1003 09:15:40.713553 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v7wcj/crc-debug-z6zpm"] Oct 03 09:15:40 crc kubenswrapper[4664]: I1003 09:15:40.764429 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6hmb\" (UniqueName: \"kubernetes.io/projected/f88ba97c-5fff-4d9e-bef7-c0b187008eaf-kube-api-access-b6hmb\") pod \"f88ba97c-5fff-4d9e-bef7-c0b187008eaf\" (UID: \"f88ba97c-5fff-4d9e-bef7-c0b187008eaf\") " Oct 03 09:15:40 crc kubenswrapper[4664]: I1003 09:15:40.764779 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f88ba97c-5fff-4d9e-bef7-c0b187008eaf-host\") pod \"f88ba97c-5fff-4d9e-bef7-c0b187008eaf\" (UID: \"f88ba97c-5fff-4d9e-bef7-c0b187008eaf\") " Oct 03 09:15:40 crc kubenswrapper[4664]: I1003 09:15:40.764962 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f88ba97c-5fff-4d9e-bef7-c0b187008eaf-host" (OuterVolumeSpecName: "host") pod "f88ba97c-5fff-4d9e-bef7-c0b187008eaf" (UID: "f88ba97c-5fff-4d9e-bef7-c0b187008eaf"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:15:40 crc kubenswrapper[4664]: I1003 09:15:40.765231 4664 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f88ba97c-5fff-4d9e-bef7-c0b187008eaf-host\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:40 crc kubenswrapper[4664]: I1003 09:15:40.772083 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88ba97c-5fff-4d9e-bef7-c0b187008eaf-kube-api-access-b6hmb" (OuterVolumeSpecName: "kube-api-access-b6hmb") pod "f88ba97c-5fff-4d9e-bef7-c0b187008eaf" (UID: "f88ba97c-5fff-4d9e-bef7-c0b187008eaf"). InnerVolumeSpecName "kube-api-access-b6hmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:15:40 crc kubenswrapper[4664]: I1003 09:15:40.867509 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6hmb\" (UniqueName: \"kubernetes.io/projected/f88ba97c-5fff-4d9e-bef7-c0b187008eaf-kube-api-access-b6hmb\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:41 crc kubenswrapper[4664]: I1003 09:15:41.608909 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="feada5b70a867a8e3ed8254c90b16eb9e8306d3ca9ff8797489e7f6ad799426e" Oct 03 09:15:41 crc kubenswrapper[4664]: I1003 09:15:41.609256 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v7wcj/crc-debug-z6zpm" Oct 03 09:15:41 crc kubenswrapper[4664]: I1003 09:15:41.893085 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88ba97c-5fff-4d9e-bef7-c0b187008eaf" path="/var/lib/kubelet/pods/f88ba97c-5fff-4d9e-bef7-c0b187008eaf/volumes" Oct 03 09:15:41 crc kubenswrapper[4664]: I1003 09:15:41.905306 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v7wcj/crc-debug-6pfgb"] Oct 03 09:15:41 crc kubenswrapper[4664]: E1003 09:15:41.905862 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f88ba97c-5fff-4d9e-bef7-c0b187008eaf" containerName="container-00" Oct 03 09:15:41 crc kubenswrapper[4664]: I1003 09:15:41.905883 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="f88ba97c-5fff-4d9e-bef7-c0b187008eaf" containerName="container-00" Oct 03 09:15:41 crc kubenswrapper[4664]: E1003 09:15:41.905900 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e397f2-9430-46e9-bce4-2a0aac2c49a9" containerName="collect-profiles" Oct 03 09:15:41 crc kubenswrapper[4664]: I1003 09:15:41.905909 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e397f2-9430-46e9-bce4-2a0aac2c49a9" containerName="collect-profiles" Oct 03 09:15:41 crc kubenswrapper[4664]: I1003 09:15:41.906140 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e397f2-9430-46e9-bce4-2a0aac2c49a9" containerName="collect-profiles" Oct 03 09:15:41 crc kubenswrapper[4664]: I1003 09:15:41.906177 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="f88ba97c-5fff-4d9e-bef7-c0b187008eaf" containerName="container-00" Oct 03 09:15:41 crc kubenswrapper[4664]: I1003 09:15:41.907024 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7wcj/crc-debug-6pfgb" Oct 03 09:15:41 crc kubenswrapper[4664]: I1003 09:15:41.909555 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-v7wcj"/"default-dockercfg-rd7wf" Oct 03 09:15:41 crc kubenswrapper[4664]: I1003 09:15:41.987060 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:15:41 crc kubenswrapper[4664]: I1003 09:15:41.987169 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:15:41 crc kubenswrapper[4664]: I1003 09:15:41.993743 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6qbg\" (UniqueName: \"kubernetes.io/projected/8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f-kube-api-access-n6qbg\") pod \"crc-debug-6pfgb\" (UID: \"8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f\") " pod="openshift-must-gather-v7wcj/crc-debug-6pfgb" Oct 03 09:15:41 crc kubenswrapper[4664]: I1003 09:15:41.993972 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f-host\") pod \"crc-debug-6pfgb\" (UID: \"8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f\") " pod="openshift-must-gather-v7wcj/crc-debug-6pfgb" Oct 03 09:15:42 crc kubenswrapper[4664]: I1003 09:15:42.095929 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f-host\") pod \"crc-debug-6pfgb\" (UID: \"8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f\") " pod="openshift-must-gather-v7wcj/crc-debug-6pfgb" Oct 03 09:15:42 crc kubenswrapper[4664]: I1003 09:15:42.096090 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6qbg\" (UniqueName: \"kubernetes.io/projected/8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f-kube-api-access-n6qbg\") pod \"crc-debug-6pfgb\" (UID: \"8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f\") " pod="openshift-must-gather-v7wcj/crc-debug-6pfgb" Oct 03 09:15:42 crc kubenswrapper[4664]: I1003 09:15:42.096093 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f-host\") pod \"crc-debug-6pfgb\" (UID: \"8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f\") " pod="openshift-must-gather-v7wcj/crc-debug-6pfgb" Oct 03 09:15:42 crc kubenswrapper[4664]: I1003 09:15:42.122085 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6qbg\" (UniqueName: \"kubernetes.io/projected/8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f-kube-api-access-n6qbg\") pod \"crc-debug-6pfgb\" (UID: \"8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f\") " pod="openshift-must-gather-v7wcj/crc-debug-6pfgb" Oct 03 09:15:42 crc kubenswrapper[4664]: I1003 09:15:42.233094 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7wcj/crc-debug-6pfgb" Oct 03 09:15:42 crc kubenswrapper[4664]: I1003 09:15:42.618828 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7wcj/crc-debug-6pfgb" event={"ID":"8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f","Type":"ContainerStarted","Data":"ff36ecde0f12f0ae5192572d3997261e1a121564eba8d786c7e38f1a8603dcf1"} Oct 03 09:15:42 crc kubenswrapper[4664]: I1003 09:15:42.619185 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7wcj/crc-debug-6pfgb" event={"ID":"8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f","Type":"ContainerStarted","Data":"e1636ebc76003847622c3909921651e36f50999b6b57fd1aa487d20758b8e59c"} Oct 03 09:15:42 crc kubenswrapper[4664]: I1003 09:15:42.636864 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v7wcj/crc-debug-6pfgb" podStartSLOduration=1.636842932 podStartE2EDuration="1.636842932s" podCreationTimestamp="2025-10-03 09:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:15:42.634566647 +0000 UTC m=+5243.455757167" watchObservedRunningTime="2025-10-03 09:15:42.636842932 +0000 UTC m=+5243.458033442" Oct 03 09:15:43 crc kubenswrapper[4664]: I1003 09:15:43.631408 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7wcj/crc-debug-6pfgb" event={"ID":"8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f","Type":"ContainerDied","Data":"ff36ecde0f12f0ae5192572d3997261e1a121564eba8d786c7e38f1a8603dcf1"} Oct 03 09:15:43 crc kubenswrapper[4664]: I1003 09:15:43.631466 4664 generic.go:334] "Generic (PLEG): container finished" podID="8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f" containerID="ff36ecde0f12f0ae5192572d3997261e1a121564eba8d786c7e38f1a8603dcf1" exitCode=0 Oct 03 09:15:44 crc kubenswrapper[4664]: I1003 09:15:44.754036 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v7wcj/crc-debug-6pfgb" Oct 03 09:15:44 crc kubenswrapper[4664]: I1003 09:15:44.839772 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6qbg\" (UniqueName: \"kubernetes.io/projected/8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f-kube-api-access-n6qbg\") pod \"8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f\" (UID: \"8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f\") " Oct 03 09:15:44 crc kubenswrapper[4664]: I1003 09:15:44.839909 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f-host\") pod \"8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f\" (UID: \"8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f\") " Oct 03 09:15:44 crc kubenswrapper[4664]: I1003 09:15:44.840236 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f-host" (OuterVolumeSpecName: "host") pod "8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f" (UID: "8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:15:44 crc kubenswrapper[4664]: I1003 09:15:44.840738 4664 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f-host\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:44 crc kubenswrapper[4664]: I1003 09:15:44.858986 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f-kube-api-access-n6qbg" (OuterVolumeSpecName: "kube-api-access-n6qbg") pod "8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f" (UID: "8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f"). InnerVolumeSpecName "kube-api-access-n6qbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:15:44 crc kubenswrapper[4664]: I1003 09:15:44.942319 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6qbg\" (UniqueName: \"kubernetes.io/projected/8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f-kube-api-access-n6qbg\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:45 crc kubenswrapper[4664]: I1003 09:15:45.655409 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7wcj/crc-debug-6pfgb" event={"ID":"8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f","Type":"ContainerDied","Data":"e1636ebc76003847622c3909921651e36f50999b6b57fd1aa487d20758b8e59c"} Oct 03 09:15:45 crc kubenswrapper[4664]: I1003 09:15:45.655458 4664 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1636ebc76003847622c3909921651e36f50999b6b57fd1aa487d20758b8e59c" Oct 03 09:15:45 crc kubenswrapper[4664]: I1003 09:15:45.655474 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v7wcj/crc-debug-6pfgb" Oct 03 09:15:49 crc kubenswrapper[4664]: I1003 09:15:49.296702 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v7wcj/crc-debug-6pfgb"] Oct 03 09:15:49 crc kubenswrapper[4664]: I1003 09:15:49.308215 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v7wcj/crc-debug-6pfgb"] Oct 03 09:15:49 crc kubenswrapper[4664]: I1003 09:15:49.892858 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f" path="/var/lib/kubelet/pods/8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f/volumes" Oct 03 09:15:50 crc kubenswrapper[4664]: I1003 09:15:50.467789 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v7wcj/crc-debug-bwrr8"] Oct 03 09:15:50 crc kubenswrapper[4664]: E1003 09:15:50.468222 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f" containerName="container-00" Oct 03 09:15:50 crc kubenswrapper[4664]: I1003 09:15:50.468236 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f" containerName="container-00" Oct 03 09:15:50 crc kubenswrapper[4664]: I1003 09:15:50.468425 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1b89f6-ef3c-47f0-90fe-b3ff427a1b1f" containerName="container-00" Oct 03 09:15:50 crc kubenswrapper[4664]: I1003 09:15:50.469074 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7wcj/crc-debug-bwrr8" Oct 03 09:15:50 crc kubenswrapper[4664]: I1003 09:15:50.472537 4664 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-v7wcj"/"default-dockercfg-rd7wf" Oct 03 09:15:50 crc kubenswrapper[4664]: I1003 09:15:50.532968 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69rlk\" (UniqueName: \"kubernetes.io/projected/c2eb97c7-1877-49e7-8419-171ea71901f2-kube-api-access-69rlk\") pod \"crc-debug-bwrr8\" (UID: \"c2eb97c7-1877-49e7-8419-171ea71901f2\") " pod="openshift-must-gather-v7wcj/crc-debug-bwrr8" Oct 03 09:15:50 crc kubenswrapper[4664]: I1003 09:15:50.533211 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c2eb97c7-1877-49e7-8419-171ea71901f2-host\") pod \"crc-debug-bwrr8\" (UID: \"c2eb97c7-1877-49e7-8419-171ea71901f2\") " pod="openshift-must-gather-v7wcj/crc-debug-bwrr8" Oct 03 09:15:50 crc kubenswrapper[4664]: I1003 09:15:50.635201 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c2eb97c7-1877-49e7-8419-171ea71901f2-host\") pod \"crc-debug-bwrr8\" (UID: \"c2eb97c7-1877-49e7-8419-171ea71901f2\") " pod="openshift-must-gather-v7wcj/crc-debug-bwrr8" Oct 03 09:15:50 crc kubenswrapper[4664]: I1003 09:15:50.635334 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69rlk\" (UniqueName: \"kubernetes.io/projected/c2eb97c7-1877-49e7-8419-171ea71901f2-kube-api-access-69rlk\") pod \"crc-debug-bwrr8\" (UID: \"c2eb97c7-1877-49e7-8419-171ea71901f2\") " pod="openshift-must-gather-v7wcj/crc-debug-bwrr8" Oct 03 09:15:50 crc kubenswrapper[4664]: I1003 09:15:50.635537 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c2eb97c7-1877-49e7-8419-171ea71901f2-host\") pod \"crc-debug-bwrr8\" (UID: \"c2eb97c7-1877-49e7-8419-171ea71901f2\") " pod="openshift-must-gather-v7wcj/crc-debug-bwrr8" Oct 03 09:15:50 crc kubenswrapper[4664]: I1003 09:15:50.663760 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69rlk\" (UniqueName: \"kubernetes.io/projected/c2eb97c7-1877-49e7-8419-171ea71901f2-kube-api-access-69rlk\") pod \"crc-debug-bwrr8\" (UID: \"c2eb97c7-1877-49e7-8419-171ea71901f2\") " pod="openshift-must-gather-v7wcj/crc-debug-bwrr8" Oct 03 09:15:50 crc kubenswrapper[4664]: I1003 09:15:50.786539 4664 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7wcj/crc-debug-bwrr8" Oct 03 09:15:51 crc kubenswrapper[4664]: I1003 09:15:51.719296 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7wcj/crc-debug-bwrr8" event={"ID":"c2eb97c7-1877-49e7-8419-171ea71901f2","Type":"ContainerStarted","Data":"679648296bfe59fa540b3baba305f1c6ef0f993cfd0d6022d87fd09350e11f84"} Oct 03 09:15:51 crc kubenswrapper[4664]: I1003 09:15:51.719351 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7wcj/crc-debug-bwrr8" event={"ID":"c2eb97c7-1877-49e7-8419-171ea71901f2","Type":"ContainerStarted","Data":"0202c0569b8ea3e10856afc73416922463c8b40a3a2481207d7104e5122f1bda"} Oct 03 09:15:52 crc kubenswrapper[4664]: I1003 09:15:52.739811 4664 generic.go:334] "Generic (PLEG): container finished" podID="c2eb97c7-1877-49e7-8419-171ea71901f2" containerID="679648296bfe59fa540b3baba305f1c6ef0f993cfd0d6022d87fd09350e11f84" exitCode=0 Oct 03 09:15:52 crc kubenswrapper[4664]: I1003 09:15:52.739865 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7wcj/crc-debug-bwrr8" event={"ID":"c2eb97c7-1877-49e7-8419-171ea71901f2","Type":"ContainerDied","Data":"679648296bfe59fa540b3baba305f1c6ef0f993cfd0d6022d87fd09350e11f84"} Oct 03 09:15:52 crc kubenswrapper[4664]: I1003 09:15:52.781024 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v7wcj/crc-debug-bwrr8"] Oct 03 09:15:52 crc kubenswrapper[4664]: I1003 09:15:52.791561 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v7wcj/crc-debug-bwrr8"] Oct 03 09:15:53 crc kubenswrapper[4664]: I1003 09:15:53.852271 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v7wcj/crc-debug-bwrr8" Oct 03 09:15:53 crc kubenswrapper[4664]: I1003 09:15:53.897395 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69rlk\" (UniqueName: \"kubernetes.io/projected/c2eb97c7-1877-49e7-8419-171ea71901f2-kube-api-access-69rlk\") pod \"c2eb97c7-1877-49e7-8419-171ea71901f2\" (UID: \"c2eb97c7-1877-49e7-8419-171ea71901f2\") " Oct 03 09:15:53 crc kubenswrapper[4664]: I1003 09:15:53.897480 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c2eb97c7-1877-49e7-8419-171ea71901f2-host\") pod \"c2eb97c7-1877-49e7-8419-171ea71901f2\" (UID: \"c2eb97c7-1877-49e7-8419-171ea71901f2\") " Oct 03 09:15:53 crc kubenswrapper[4664]: I1003 09:15:53.897944 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2eb97c7-1877-49e7-8419-171ea71901f2-host" (OuterVolumeSpecName: "host") pod "c2eb97c7-1877-49e7-8419-171ea71901f2" (UID: "c2eb97c7-1877-49e7-8419-171ea71901f2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:15:53 crc kubenswrapper[4664]: I1003 09:15:53.898430 4664 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c2eb97c7-1877-49e7-8419-171ea71901f2-host\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:53 crc kubenswrapper[4664]: I1003 09:15:53.907891 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2eb97c7-1877-49e7-8419-171ea71901f2-kube-api-access-69rlk" (OuterVolumeSpecName: "kube-api-access-69rlk") pod "c2eb97c7-1877-49e7-8419-171ea71901f2" (UID: "c2eb97c7-1877-49e7-8419-171ea71901f2"). 
InnerVolumeSpecName "kube-api-access-69rlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:15:54 crc kubenswrapper[4664]: I1003 09:15:54.000090 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69rlk\" (UniqueName: \"kubernetes.io/projected/c2eb97c7-1877-49e7-8419-171ea71901f2-kube-api-access-69rlk\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:54 crc kubenswrapper[4664]: I1003 09:15:54.331479 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72_a1139aaa-91dd-410c-b4f8-695cad546424/util/0.log" Oct 03 09:15:54 crc kubenswrapper[4664]: I1003 09:15:54.504404 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72_a1139aaa-91dd-410c-b4f8-695cad546424/util/0.log" Oct 03 09:15:54 crc kubenswrapper[4664]: I1003 09:15:54.543068 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72_a1139aaa-91dd-410c-b4f8-695cad546424/pull/0.log" Oct 03 09:15:54 crc kubenswrapper[4664]: I1003 09:15:54.548899 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72_a1139aaa-91dd-410c-b4f8-695cad546424/pull/0.log" Oct 03 09:15:54 crc kubenswrapper[4664]: I1003 09:15:54.754569 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72_a1139aaa-91dd-410c-b4f8-695cad546424/extract/0.log" Oct 03 09:15:54 crc kubenswrapper[4664]: I1003 09:15:54.758318 4664 scope.go:117] "RemoveContainer" containerID="679648296bfe59fa540b3baba305f1c6ef0f993cfd0d6022d87fd09350e11f84" Oct 03 09:15:54 crc kubenswrapper[4664]: I1003 09:15:54.758454 4664 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7wcj/crc-debug-bwrr8" Oct 03 09:15:54 crc kubenswrapper[4664]: I1003 09:15:54.772658 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72_a1139aaa-91dd-410c-b4f8-695cad546424/util/0.log" Oct 03 09:15:54 crc kubenswrapper[4664]: I1003 09:15:54.803088 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a2191722de8a7ecb953dc68cb47425cc2467b1a758620d4a8c9fcb07e5f8s72_a1139aaa-91dd-410c-b4f8-695cad546424/pull/0.log" Oct 03 09:15:54 crc kubenswrapper[4664]: I1003 09:15:54.927041 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-nw8d7_16a5e165-d1b6-4023-b48e-f4b918730203/kube-rbac-proxy/0.log" Oct 03 09:15:54 crc kubenswrapper[4664]: I1003 09:15:54.998979 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-mhzlb_30114f47-bc69-49b2-b3ab-d201c2c146ee/kube-rbac-proxy/0.log" Oct 03 09:15:55 crc kubenswrapper[4664]: I1003 09:15:55.052267 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-nw8d7_16a5e165-d1b6-4023-b48e-f4b918730203/manager/0.log" Oct 03 09:15:55 crc kubenswrapper[4664]: I1003 09:15:55.153985 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-mhzlb_30114f47-bc69-49b2-b3ab-d201c2c146ee/manager/0.log" Oct 03 09:15:55 crc kubenswrapper[4664]: I1003 09:15:55.187089 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-bvgv4_f1eed736-5b4e-4e82-915d-288d59a82b94/kube-rbac-proxy/0.log" Oct 03 09:15:55 crc kubenswrapper[4664]: I1003 09:15:55.249452 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-bvgv4_f1eed736-5b4e-4e82-915d-288d59a82b94/manager/0.log" Oct 03 09:15:55 crc kubenswrapper[4664]: I1003 09:15:55.363183 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-44trb_d80c9513-6585-415d-a747-2ff4bcca33d7/kube-rbac-proxy/0.log" Oct 03 09:15:55 crc kubenswrapper[4664]: I1003 09:15:55.440238 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-44trb_d80c9513-6585-415d-a747-2ff4bcca33d7/manager/0.log" Oct 03 09:15:55 crc kubenswrapper[4664]: I1003 09:15:55.535234 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-g5h6q_56f9a3d8-3c1e-4d8d-a005-3c5f42beb67e/kube-rbac-proxy/0.log" Oct 03 09:15:55 crc kubenswrapper[4664]: I1003 09:15:55.565189 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-g5h6q_56f9a3d8-3c1e-4d8d-a005-3c5f42beb67e/manager/0.log" Oct 03 09:15:55 crc kubenswrapper[4664]: I1003 09:15:55.694092 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-qlgfl_28f90c47-2add-426b-953e-be4842b71cfc/kube-rbac-proxy/0.log" Oct 03 09:15:55 crc kubenswrapper[4664]: I1003 09:15:55.753074 4664 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-qlgfl_28f90c47-2add-426b-953e-be4842b71cfc/manager/0.log" Oct 03 09:15:55 crc kubenswrapper[4664]: I1003 09:15:55.840790 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-n8zxl_47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc/kube-rbac-proxy/0.log" Oct 03 09:15:55 crc kubenswrapper[4664]: I1003 09:15:55.898271 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2eb97c7-1877-49e7-8419-171ea71901f2" path="/var/lib/kubelet/pods/c2eb97c7-1877-49e7-8419-171ea71901f2/volumes" Oct 03 09:15:55 crc kubenswrapper[4664]: I1003 09:15:55.985051 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-fhpcz_45229d75-9e65-47bb-a54a-b27742fe2717/kube-rbac-proxy/0.log" Oct 03 09:15:56 crc kubenswrapper[4664]: I1003 09:15:56.063301 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-n8zxl_47fe850e-9d6e-4a75-9e91-2c87cdb2e8bc/manager/0.log" Oct 03 09:15:56 crc kubenswrapper[4664]: I1003 09:15:56.086968 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-fhpcz_45229d75-9e65-47bb-a54a-b27742fe2717/manager/0.log" Oct 03 09:15:56 crc kubenswrapper[4664]: I1003 09:15:56.209094 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-qf7qg_2ed5f442-2c30-4c29-a65b-4b3d9262cbce/kube-rbac-proxy/0.log" Oct 03 09:15:56 crc kubenswrapper[4664]: I1003 09:15:56.336781 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-qf7qg_2ed5f442-2c30-4c29-a65b-4b3d9262cbce/manager/0.log" Oct 03 09:15:56 crc kubenswrapper[4664]: I1003 09:15:56.396955 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-8nvt9_738a79fc-a362-4c7c-a101-f8551019a96b/kube-rbac-proxy/0.log" Oct 03 09:15:56 crc kubenswrapper[4664]: I1003 09:15:56.398508 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-8nvt9_738a79fc-a362-4c7c-a101-f8551019a96b/manager/0.log" Oct 03 09:15:56 crc kubenswrapper[4664]: I1003 09:15:56.522786 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-tz9q5_d5285a08-51d1-4f27-b87e-73fc4c0dc037/kube-rbac-proxy/0.log" Oct 03 09:15:56 crc kubenswrapper[4664]: I1003 09:15:56.630102 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-tz9q5_d5285a08-51d1-4f27-b87e-73fc4c0dc037/manager/0.log" Oct 03 09:15:56 crc kubenswrapper[4664]: I1003 09:15:56.708197 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-bf92q_7d9f54b0-ed53-4042-98d1-eb5ceb6f629b/kube-rbac-proxy/0.log" Oct 03 09:15:56 crc kubenswrapper[4664]: I1003 09:15:56.740211 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-bf92q_7d9f54b0-ed53-4042-98d1-eb5ceb6f629b/manager/0.log" Oct 03 09:15:57 crc kubenswrapper[4664]: I1003 09:15:57.610885 4664 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-59qtz_5bedd109-c928-4b87-8593-21f72b0a6165/kube-rbac-proxy/0.log" Oct 03 09:15:57 crc kubenswrapper[4664]: I1003 09:15:57.636180 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-8jdvq_a931be65-b5d0-4685-82ea-42102c8235c8/kube-rbac-proxy/0.log" Oct 03 09:15:57 crc kubenswrapper[4664]: I1003 09:15:57.641062 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-59qtz_5bedd109-c928-4b87-8593-21f72b0a6165/manager/0.log" Oct 03 09:15:57 crc kubenswrapper[4664]: I1003 09:15:57.691045 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-8jdvq_a931be65-b5d0-4685-82ea-42102c8235c8/manager/0.log" Oct 03 09:15:57 crc kubenswrapper[4664]: I1003 09:15:57.807651 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h_2784f55b-9b0e-49e0-9a5b-df56008a2be9/manager/0.log" Oct 03 09:15:57 crc kubenswrapper[4664]: I1003 09:15:57.810001 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678qkr5h_2784f55b-9b0e-49e0-9a5b-df56008a2be9/kube-rbac-proxy/0.log" Oct 03 09:15:57 crc kubenswrapper[4664]: I1003 09:15:57.887406 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6f9d674864-tpv2h_9d494f25-e54e-4ddf-a4cc-a632d05db780/kube-rbac-proxy/0.log" Oct 03 09:15:57 crc kubenswrapper[4664]: I1003 09:15:57.996674 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5597f8fd94-dj29q_4d57e4e0-7496-4ae6-8871-97016022a533/kube-rbac-proxy/0.log" Oct 03 09:15:58 crc kubenswrapper[4664]: I1003 09:15:58.228343 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zg5rg_4fef1902-6311-446b-b38b-4af3c3035b30/registry-server/0.log" Oct 03 09:15:58 crc kubenswrapper[4664]: I1003 09:15:58.330394 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5597f8fd94-dj29q_4d57e4e0-7496-4ae6-8871-97016022a533/operator/0.log" Oct 03 09:15:58 crc kubenswrapper[4664]: I1003 09:15:58.409862 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-ct22p_e8cd4885-34c3-4a78-b23d-9e57aa0517ca/kube-rbac-proxy/0.log" Oct 03 09:15:58 crc kubenswrapper[4664]: I1003 09:15:58.535211 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-fg9pn_bca141be-db7d-4e1e-b95f-12f9b63522b7/kube-rbac-proxy/0.log" Oct 03 09:15:58 crc kubenswrapper[4664]: I1003 09:15:58.539848 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-ct22p_e8cd4885-34c3-4a78-b23d-9e57aa0517ca/manager/0.log" Oct 03 09:15:58 crc kubenswrapper[4664]: I1003 09:15:58.674137 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-fg9pn_bca141be-db7d-4e1e-b95f-12f9b63522b7/manager/0.log" Oct 03 09:15:58 crc kubenswrapper[4664]: I1003 
09:15:58.773598 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-s6fqp_928529ba-1444-4ccc-9fab-cac6102c3375/operator/0.log" Oct 03 09:15:59 crc kubenswrapper[4664]: I1003 09:15:59.504582 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6f9d674864-tpv2h_9d494f25-e54e-4ddf-a4cc-a632d05db780/manager/0.log" Oct 03 09:15:59 crc kubenswrapper[4664]: I1003 09:15:59.617363 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-kcqct_294a44e7-d4f2-4162-9f34-2cd9c4a9aa49/manager/0.log" Oct 03 09:15:59 crc kubenswrapper[4664]: I1003 09:15:59.655944 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-kcqct_294a44e7-d4f2-4162-9f34-2cd9c4a9aa49/kube-rbac-proxy/0.log" Oct 03 09:15:59 crc kubenswrapper[4664]: I1003 09:15:59.672967 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-fm85d_6492ba28-5e2e-42b3-829e-5b666703bd85/kube-rbac-proxy/0.log" Oct 03 09:15:59 crc kubenswrapper[4664]: I1003 09:15:59.780690 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-fm85d_6492ba28-5e2e-42b3-829e-5b666703bd85/manager/0.log" Oct 03 09:15:59 crc kubenswrapper[4664]: I1003 09:15:59.834577 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-t48px_07ab2a7c-e330-498d-a1b0-b2155c491839/kube-rbac-proxy/0.log" Oct 03 09:15:59 crc kubenswrapper[4664]: I1003 09:15:59.922470 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-t48px_07ab2a7c-e330-498d-a1b0-b2155c491839/manager/0.log" Oct 03 09:15:59 crc kubenswrapper[4664]: I1003 09:15:59.983500 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-fcd7d9895-8wwpz_25a36ab5-71b4-4660-99a6-86c3c6554c86/kube-rbac-proxy/0.log" Oct 03 09:16:00 crc kubenswrapper[4664]: I1003 09:16:00.037599 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-fcd7d9895-8wwpz_25a36ab5-71b4-4660-99a6-86c3c6554c86/manager/0.log" Oct 03 09:16:11 crc kubenswrapper[4664]: I1003 09:16:11.987486 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:16:11 crc kubenswrapper[4664]: I1003 09:16:11.987949 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:16:15 crc kubenswrapper[4664]: I1003 09:16:15.874575 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-z8tqh_c452acc1-8a32-47e7-9f61-4a7203b878c0/control-plane-machine-set-operator/0.log" Oct 03 09:16:16 crc 
kubenswrapper[4664]: I1003 09:16:16.058774 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wd7mj_f3a78dac-6d58-4647-83fb-b0f36f2f660a/kube-rbac-proxy/0.log" Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.086006 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wd7mj_f3a78dac-6d58-4647-83fb-b0f36f2f660a/machine-api-operator/0.log" Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.544018 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-btbwk"] Oct 03 09:16:16 crc kubenswrapper[4664]: E1003 09:16:16.544595 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2eb97c7-1877-49e7-8419-171ea71901f2" containerName="container-00" Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.544633 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2eb97c7-1877-49e7-8419-171ea71901f2" containerName="container-00" Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.544813 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2eb97c7-1877-49e7-8419-171ea71901f2" containerName="container-00" Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.546194 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btbwk" Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.572223 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-btbwk"] Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.623754 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plg6h\" (UniqueName: \"kubernetes.io/projected/33be4cfa-29e6-4e1d-99df-278ee7ab5863-kube-api-access-plg6h\") pod \"redhat-marketplace-btbwk\" (UID: \"33be4cfa-29e6-4e1d-99df-278ee7ab5863\") " pod="openshift-marketplace/redhat-marketplace-btbwk" Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.623815 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33be4cfa-29e6-4e1d-99df-278ee7ab5863-catalog-content\") pod \"redhat-marketplace-btbwk\" (UID: \"33be4cfa-29e6-4e1d-99df-278ee7ab5863\") " pod="openshift-marketplace/redhat-marketplace-btbwk" Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.624167 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33be4cfa-29e6-4e1d-99df-278ee7ab5863-utilities\") pod \"redhat-marketplace-btbwk\" (UID: \"33be4cfa-29e6-4e1d-99df-278ee7ab5863\") " pod="openshift-marketplace/redhat-marketplace-btbwk" Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.726068 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33be4cfa-29e6-4e1d-99df-278ee7ab5863-utilities\") pod \"redhat-marketplace-btbwk\" (UID: \"33be4cfa-29e6-4e1d-99df-278ee7ab5863\") " pod="openshift-marketplace/redhat-marketplace-btbwk" Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.726184 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plg6h\" (UniqueName: \"kubernetes.io/projected/33be4cfa-29e6-4e1d-99df-278ee7ab5863-kube-api-access-plg6h\") pod \"redhat-marketplace-btbwk\" (UID: 
\"33be4cfa-29e6-4e1d-99df-278ee7ab5863\") " pod="openshift-marketplace/redhat-marketplace-btbwk" Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.726224 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33be4cfa-29e6-4e1d-99df-278ee7ab5863-catalog-content\") pod \"redhat-marketplace-btbwk\" (UID: \"33be4cfa-29e6-4e1d-99df-278ee7ab5863\") " pod="openshift-marketplace/redhat-marketplace-btbwk" Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.726501 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33be4cfa-29e6-4e1d-99df-278ee7ab5863-utilities\") pod \"redhat-marketplace-btbwk\" (UID: \"33be4cfa-29e6-4e1d-99df-278ee7ab5863\") " pod="openshift-marketplace/redhat-marketplace-btbwk" Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.726657 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33be4cfa-29e6-4e1d-99df-278ee7ab5863-catalog-content\") pod \"redhat-marketplace-btbwk\" (UID: \"33be4cfa-29e6-4e1d-99df-278ee7ab5863\") " pod="openshift-marketplace/redhat-marketplace-btbwk" Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.754716 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plg6h\" (UniqueName: \"kubernetes.io/projected/33be4cfa-29e6-4e1d-99df-278ee7ab5863-kube-api-access-plg6h\") pod \"redhat-marketplace-btbwk\" (UID: \"33be4cfa-29e6-4e1d-99df-278ee7ab5863\") " pod="openshift-marketplace/redhat-marketplace-btbwk" Oct 03 09:16:16 crc kubenswrapper[4664]: I1003 09:16:16.877943 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btbwk" Oct 03 09:16:17 crc kubenswrapper[4664]: I1003 09:16:17.401685 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-btbwk"] Oct 03 09:16:17 crc kubenswrapper[4664]: I1003 09:16:17.980470 4664 generic.go:334] "Generic (PLEG): container finished" podID="33be4cfa-29e6-4e1d-99df-278ee7ab5863" containerID="e87454e492ba1c006396b15962e5ee96e2620d04a2ba37ab7fa10f8c179d2aa1" exitCode=0 Oct 03 09:16:17 crc kubenswrapper[4664]: I1003 09:16:17.980520 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btbwk" event={"ID":"33be4cfa-29e6-4e1d-99df-278ee7ab5863","Type":"ContainerDied","Data":"e87454e492ba1c006396b15962e5ee96e2620d04a2ba37ab7fa10f8c179d2aa1"} Oct 03 09:16:17 crc kubenswrapper[4664]: I1003 09:16:17.980548 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btbwk" event={"ID":"33be4cfa-29e6-4e1d-99df-278ee7ab5863","Type":"ContainerStarted","Data":"1f7181fd2470113fb5a901dcadf5f85dd2bcdd23c8c64969e4e7a5d35b8ff8dd"} Oct 03 09:16:19 crc kubenswrapper[4664]: I1003 09:16:19.999636 4664 generic.go:334] "Generic (PLEG): container finished" podID="33be4cfa-29e6-4e1d-99df-278ee7ab5863" containerID="fd5aaf7e2293d423fa4cda2216ba21433c8806e01cf6d58888cab2690247233f" exitCode=0 Oct 03 09:16:19 crc kubenswrapper[4664]: I1003 09:16:19.999749 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btbwk" event={"ID":"33be4cfa-29e6-4e1d-99df-278ee7ab5863","Type":"ContainerDied","Data":"fd5aaf7e2293d423fa4cda2216ba21433c8806e01cf6d58888cab2690247233f"} Oct 03 09:16:22 crc kubenswrapper[4664]: I1003 09:16:22.019651 
4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btbwk" event={"ID":"33be4cfa-29e6-4e1d-99df-278ee7ab5863","Type":"ContainerStarted","Data":"b08dd45b749ed4071e77cefbcb3072b03349db5212a4627d837ff143bdd04575"}
Oct 03 09:16:22 crc kubenswrapper[4664]: I1003 09:16:22.043443 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-btbwk" podStartSLOduration=3.168615222 podStartE2EDuration="6.043426992s" podCreationTimestamp="2025-10-03 09:16:16 +0000 UTC" firstStartedPulling="2025-10-03 09:16:17.996360685 +0000 UTC m=+5278.817551275" lastFinishedPulling="2025-10-03 09:16:20.871172555 +0000 UTC m=+5281.692363045" observedRunningTime="2025-10-03 09:16:22.034409424 +0000 UTC m=+5282.855599934" watchObservedRunningTime="2025-10-03 09:16:22.043426992 +0000 UTC m=+5282.864617482"
Oct 03 09:16:26 crc kubenswrapper[4664]: I1003 09:16:26.878677 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-btbwk"
Oct 03 09:16:26 crc kubenswrapper[4664]: I1003 09:16:26.879332 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-btbwk"
Oct 03 09:16:26 crc kubenswrapper[4664]: I1003 09:16:26.967368 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-btbwk"
Oct 03 09:16:27 crc kubenswrapper[4664]: I1003 09:16:27.147991 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-btbwk"
Oct 03 09:16:27 crc kubenswrapper[4664]: I1003 09:16:27.336989 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-btbwk"]
Oct 03 09:16:29 crc kubenswrapper[4664]: I1003 09:16:29.094485 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-btbwk" podUID="33be4cfa-29e6-4e1d-99df-278ee7ab5863" containerName="registry-server" containerID="cri-o://b08dd45b749ed4071e77cefbcb3072b03349db5212a4627d837ff143bdd04575" gracePeriod=2
Oct 03 09:16:29 crc kubenswrapper[4664]: I1003 09:16:29.570731 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btbwk"
Oct 03 09:16:29 crc kubenswrapper[4664]: I1003 09:16:29.663798 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-2f8b7_d3d13871-7a04-4b9f-a8f0-7fbaff9cff1e/cert-manager-controller/0.log"
Oct 03 09:16:29 crc kubenswrapper[4664]: I1003 09:16:29.738059 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33be4cfa-29e6-4e1d-99df-278ee7ab5863-catalog-content\") pod \"33be4cfa-29e6-4e1d-99df-278ee7ab5863\" (UID: \"33be4cfa-29e6-4e1d-99df-278ee7ab5863\") "
Oct 03 09:16:29 crc kubenswrapper[4664]: I1003 09:16:29.738168 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plg6h\" (UniqueName: \"kubernetes.io/projected/33be4cfa-29e6-4e1d-99df-278ee7ab5863-kube-api-access-plg6h\") pod \"33be4cfa-29e6-4e1d-99df-278ee7ab5863\" (UID: \"33be4cfa-29e6-4e1d-99df-278ee7ab5863\") "
Oct 03 09:16:29 crc kubenswrapper[4664]: I1003 09:16:29.738239 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33be4cfa-29e6-4e1d-99df-278ee7ab5863-utilities\") pod \"33be4cfa-29e6-4e1d-99df-278ee7ab5863\" (UID: \"33be4cfa-29e6-4e1d-99df-278ee7ab5863\") "
Oct 03 09:16:29 crc kubenswrapper[4664]: I1003 09:16:29.739509 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33be4cfa-29e6-4e1d-99df-278ee7ab5863-utilities" (OuterVolumeSpecName: "utilities") pod "33be4cfa-29e6-4e1d-99df-278ee7ab5863" (UID: "33be4cfa-29e6-4e1d-99df-278ee7ab5863"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:16:29 crc kubenswrapper[4664]: I1003 09:16:29.754811 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33be4cfa-29e6-4e1d-99df-278ee7ab5863-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33be4cfa-29e6-4e1d-99df-278ee7ab5863" (UID: "33be4cfa-29e6-4e1d-99df-278ee7ab5863"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:16:29 crc kubenswrapper[4664]: I1003 09:16:29.755655 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33be4cfa-29e6-4e1d-99df-278ee7ab5863-kube-api-access-plg6h" (OuterVolumeSpecName: "kube-api-access-plg6h") pod "33be4cfa-29e6-4e1d-99df-278ee7ab5863" (UID: "33be4cfa-29e6-4e1d-99df-278ee7ab5863"). InnerVolumeSpecName "kube-api-access-plg6h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 09:16:29 crc kubenswrapper[4664]: I1003 09:16:29.809870 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-zvfms_bc158e46-d2d5-4c58-aa38-a1d395d68991/cert-manager-cainjector/0.log"
Oct 03 09:16:29 crc kubenswrapper[4664]: I1003 09:16:29.839994 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33be4cfa-29e6-4e1d-99df-278ee7ab5863-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 09:16:29 crc kubenswrapper[4664]: I1003 09:16:29.840028 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plg6h\" (UniqueName: \"kubernetes.io/projected/33be4cfa-29e6-4e1d-99df-278ee7ab5863-kube-api-access-plg6h\") on node \"crc\" DevicePath \"\""
Oct 03 09:16:29 crc kubenswrapper[4664]: I1003 09:16:29.840040 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33be4cfa-29e6-4e1d-99df-278ee7ab5863-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 09:16:29 crc kubenswrapper[4664]: I1003 09:16:29.895980 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-xtrlc_bbaac95d-7035-4d08-97e6-70e1b4ef4b3f/cert-manager-webhook/0.log"
Oct 03 09:16:30 crc kubenswrapper[4664]: I1003 09:16:30.104820 4664 generic.go:334] "Generic (PLEG): container finished" podID="33be4cfa-29e6-4e1d-99df-278ee7ab5863" containerID="b08dd45b749ed4071e77cefbcb3072b03349db5212a4627d837ff143bdd04575" exitCode=0
Oct 03 09:16:30 crc kubenswrapper[4664]: I1003 09:16:30.104915 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btbwk"
Oct 03 09:16:30 crc kubenswrapper[4664]: I1003 09:16:30.104909 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btbwk" event={"ID":"33be4cfa-29e6-4e1d-99df-278ee7ab5863","Type":"ContainerDied","Data":"b08dd45b749ed4071e77cefbcb3072b03349db5212a4627d837ff143bdd04575"}
Oct 03 09:16:30 crc kubenswrapper[4664]: I1003 09:16:30.105343 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btbwk" event={"ID":"33be4cfa-29e6-4e1d-99df-278ee7ab5863","Type":"ContainerDied","Data":"1f7181fd2470113fb5a901dcadf5f85dd2bcdd23c8c64969e4e7a5d35b8ff8dd"}
Oct 03 09:16:30 crc kubenswrapper[4664]: I1003 09:16:30.105392 4664 scope.go:117] "RemoveContainer" containerID="b08dd45b749ed4071e77cefbcb3072b03349db5212a4627d837ff143bdd04575"
Oct 03 09:16:30 crc kubenswrapper[4664]: I1003 09:16:30.127793 4664 scope.go:117] "RemoveContainer" containerID="fd5aaf7e2293d423fa4cda2216ba21433c8806e01cf6d58888cab2690247233f"
Oct 03 09:16:30 crc kubenswrapper[4664]: I1003 09:16:30.134262 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-btbwk"]
Oct 03 09:16:30 crc kubenswrapper[4664]: I1003 09:16:30.146577 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-btbwk"]
Oct 03 09:16:30 crc kubenswrapper[4664]: I1003 09:16:30.155149 4664 scope.go:117] "RemoveContainer" containerID="e87454e492ba1c006396b15962e5ee96e2620d04a2ba37ab7fa10f8c179d2aa1"
Oct 03 09:16:30 crc kubenswrapper[4664]: I1003 09:16:30.202952 4664 scope.go:117] "RemoveContainer" containerID="b08dd45b749ed4071e77cefbcb3072b03349db5212a4627d837ff143bdd04575"
Oct 03 09:16:30 crc kubenswrapper[4664]: E1003 09:16:30.203424 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b08dd45b749ed4071e77cefbcb3072b03349db5212a4627d837ff143bdd04575\": container with ID starting with b08dd45b749ed4071e77cefbcb3072b03349db5212a4627d837ff143bdd04575 not found: ID does not exist" containerID="b08dd45b749ed4071e77cefbcb3072b03349db5212a4627d837ff143bdd04575"
Oct 03 09:16:30 crc kubenswrapper[4664]: I1003 09:16:30.203475 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b08dd45b749ed4071e77cefbcb3072b03349db5212a4627d837ff143bdd04575"} err="failed to get container status \"b08dd45b749ed4071e77cefbcb3072b03349db5212a4627d837ff143bdd04575\": rpc error: code = NotFound desc = could not find container \"b08dd45b749ed4071e77cefbcb3072b03349db5212a4627d837ff143bdd04575\": container with ID starting with b08dd45b749ed4071e77cefbcb3072b03349db5212a4627d837ff143bdd04575 not found: ID does not exist"
Oct 03 09:16:30 crc kubenswrapper[4664]: I1003 09:16:30.203506 4664 scope.go:117] "RemoveContainer" containerID="fd5aaf7e2293d423fa4cda2216ba21433c8806e01cf6d58888cab2690247233f"
Oct 03 09:16:30 crc kubenswrapper[4664]: E1003 09:16:30.203816 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd5aaf7e2293d423fa4cda2216ba21433c8806e01cf6d58888cab2690247233f\": container with ID starting with fd5aaf7e2293d423fa4cda2216ba21433c8806e01cf6d58888cab2690247233f not found: ID does not exist" containerID="fd5aaf7e2293d423fa4cda2216ba21433c8806e01cf6d58888cab2690247233f"
Oct 03 09:16:30 crc kubenswrapper[4664]: I1003 09:16:30.203847 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd5aaf7e2293d423fa4cda2216ba21433c8806e01cf6d58888cab2690247233f"} err="failed to get container status \"fd5aaf7e2293d423fa4cda2216ba21433c8806e01cf6d58888cab2690247233f\": rpc error: code = NotFound desc = could not find container \"fd5aaf7e2293d423fa4cda2216ba21433c8806e01cf6d58888cab2690247233f\": container with ID starting with fd5aaf7e2293d423fa4cda2216ba21433c8806e01cf6d58888cab2690247233f not found: ID does not exist"
Oct 03 09:16:30 crc kubenswrapper[4664]: I1003 09:16:30.203866 4664 scope.go:117] "RemoveContainer" containerID="e87454e492ba1c006396b15962e5ee96e2620d04a2ba37ab7fa10f8c179d2aa1"
Oct 03 09:16:30 crc kubenswrapper[4664]: E1003 09:16:30.204060 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e87454e492ba1c006396b15962e5ee96e2620d04a2ba37ab7fa10f8c179d2aa1\": container with ID starting with e87454e492ba1c006396b15962e5ee96e2620d04a2ba37ab7fa10f8c179d2aa1 not found: ID does not exist" containerID="e87454e492ba1c006396b15962e5ee96e2620d04a2ba37ab7fa10f8c179d2aa1"
Oct 03 09:16:30 crc kubenswrapper[4664]: I1003 09:16:30.204086 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e87454e492ba1c006396b15962e5ee96e2620d04a2ba37ab7fa10f8c179d2aa1"} err="failed to get container status \"e87454e492ba1c006396b15962e5ee96e2620d04a2ba37ab7fa10f8c179d2aa1\": rpc error: code = NotFound desc = could not find container \"e87454e492ba1c006396b15962e5ee96e2620d04a2ba37ab7fa10f8c179d2aa1\": container with ID starting with e87454e492ba1c006396b15962e5ee96e2620d04a2ba37ab7fa10f8c179d2aa1 not found: ID does not exist"
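Note: the paired "ContainerStatus from runtime service failed" / "DeleteContainer returned error" lines above are a benign race, not a real fault: the kubelet retries RemoveContainer for IDs that CRI-O has already purged, and the runtime answers with gRPC NotFound. A minimal sketch of how cleanup code typically treats that status as "already gone" (hypothetical helper, using the standard google.golang.org/grpc/status package):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // ignoreNotFound makes a retried container deletion idempotent:
    // gRPC NotFound means the runtime already removed the container.
    func ignoreNotFound(err error) error {
        if st, ok := status.FromError(err); ok && st.Code() == codes.NotFound {
            return nil
        }
        return err
    }

    func main() {
        err := status.Error(codes.NotFound, "could not find container")
        fmt.Println(ignoreNotFound(err)) // <nil>: already gone, safe to ignore
    }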
Oct 03 09:16:31 crc kubenswrapper[4664]: I1003 09:16:31.896740 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33be4cfa-29e6-4e1d-99df-278ee7ab5863" path="/var/lib/kubelet/pods/33be4cfa-29e6-4e1d-99df-278ee7ab5863/volumes"
Oct 03 09:16:41 crc kubenswrapper[4664]: I1003 09:16:41.987240 4664 patch_prober.go:28] interesting pod/machine-config-daemon-x9dgm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 09:16:41 crc kubenswrapper[4664]: I1003 09:16:41.987790 4664 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 09:16:41 crc kubenswrapper[4664]: I1003 09:16:41.987837 4664 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm"
Oct 03 09:16:41 crc kubenswrapper[4664]: I1003 09:16:41.988595 4664 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107"} pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 09:16:41 crc kubenswrapper[4664]: I1003 09:16:41.988711 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerName="machine-config-daemon" containerID="cri-o://ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107" gracePeriod=600
Oct 03 09:16:42 crc kubenswrapper[4664]: E1003 09:16:42.125000 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 09:16:42 crc kubenswrapper[4664]: I1003 09:16:42.214620 4664 generic.go:334] "Generic (PLEG): container finished" podID="598b81ce-0ce7-498f-9337-ae5e6e64682b" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107" exitCode=0
Oct 03 09:16:42 crc kubenswrapper[4664]: I1003 09:16:42.214674 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" event={"ID":"598b81ce-0ce7-498f-9337-ae5e6e64682b","Type":"ContainerDied","Data":"ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107"}
Oct 03 09:16:42 crc kubenswrapper[4664]: I1003 09:16:42.214716 4664 scope.go:117] "RemoveContainer" containerID="b2f0e8060b32103ae029b4593961836565e27c6c08ac83af035e3afa6eaadd61"
Oct 03 09:16:42 crc kubenswrapper[4664]: I1003 09:16:42.215444 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107"
Oct 03 09:16:42 crc kubenswrapper[4664]: E1003 09:16:42.215822 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 09:16:43 crc kubenswrapper[4664]: I1003 09:16:43.649693 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-l6sj8_6888a579-64c6-4313-bb63-5d8e09d9389c/nmstate-console-plugin/0.log"
Oct 03 09:16:43 crc kubenswrapper[4664]: I1003 09:16:43.799962 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-flh2z_cff70c7f-0a7c-4288-ae5a-74a88043586b/nmstate-handler/0.log"
Oct 03 09:16:43 crc kubenswrapper[4664]: I1003 09:16:43.858325 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-5cmx6_53e89226-668e-4f4e-9035-293df2a37944/kube-rbac-proxy/0.log"
Oct 03 09:16:43 crc kubenswrapper[4664]: I1003 09:16:43.882719 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-5cmx6_53e89226-668e-4f4e-9035-293df2a37944/nmstate-metrics/0.log"
Oct 03 09:16:44 crc kubenswrapper[4664]: I1003 09:16:44.089284 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-fj4m9_ef13533a-c837-42d9-87f7-2025250fd36f/nmstate-operator/0.log"
Oct 03 09:16:44 crc kubenswrapper[4664]: I1003 09:16:44.109775 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-jct4m_c3c37a07-83fe-42c4-89b3-ab591db694aa/nmstate-webhook/0.log"
Oct 03 09:16:57 crc kubenswrapper[4664]: I1003 09:16:57.876532 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107"
Oct 03 09:16:57 crc kubenswrapper[4664]: E1003 09:16:57.877673 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 09:17:00 crc kubenswrapper[4664]: I1003 09:17:00.579736 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-nlqsf_6adfa358-0d56-406d-ac16-a982e3049c21/kube-rbac-proxy/0.log"
Oct 03 09:17:00 crc kubenswrapper[4664]: I1003 09:17:00.717473 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-nlqsf_6adfa358-0d56-406d-ac16-a982e3049c21/controller/0.log"
Oct 03 09:17:00 crc kubenswrapper[4664]: I1003 09:17:00.796382 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/cp-frr-files/0.log"
Oct 03 09:17:00 crc kubenswrapper[4664]: I1003 09:17:00.979004 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/cp-metrics/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.009015 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/cp-reloader/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.012143 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/cp-reloader/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.039373 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/cp-frr-files/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.164187 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/cp-reloader/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.171411 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/cp-frr-files/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.220426 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/cp-metrics/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.251432 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/cp-metrics/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.429596 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/cp-metrics/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.446769 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/cp-reloader/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.447583 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/cp-frr-files/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.470521 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/controller/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.667862 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/kube-rbac-proxy/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.690226 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/kube-rbac-proxy-frr/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.750475 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/frr-metrics/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.853806 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/reloader/0.log"
Oct 03 09:17:01 crc kubenswrapper[4664]: I1003 09:17:01.959933 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-j98p4_28c891c2-52c6-400d-95c8-5c29bb6962a7/frr-k8s-webhook-server/0.log"
Oct 03 09:17:02 crc kubenswrapper[4664]: I1003 09:17:02.140713 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-78d5f5dfc9-nqdfv_3493d554-e23b-40b4-b583-45650c6e2b0a/manager/0.log"
Oct 03 09:17:02 crc kubenswrapper[4664]: I1003 09:17:02.299667 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5df98bb99-xxf5h_10515fdf-6940-4258-bd80-714dbb847505/webhook-server/0.log"
Oct 03 09:17:02 crc kubenswrapper[4664]: I1003 09:17:02.449441 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rq5kc_876be351-8fa7-4af6-b979-17941886901e/kube-rbac-proxy/0.log"
Oct 03 09:17:03 crc kubenswrapper[4664]: I1003 09:17:03.016062 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rq5kc_876be351-8fa7-4af6-b979-17941886901e/speaker/0.log"
Oct 03 09:17:03 crc kubenswrapper[4664]: I1003 09:17:03.116282 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vzcf_1604a32c-c92b-4bed-9402-1ea47abcc2ea/frr/0.log"
Oct 03 09:17:12 crc kubenswrapper[4664]: I1003 09:17:12.877165 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107"
Oct 03 09:17:12 crc kubenswrapper[4664]: E1003 09:17:12.880085 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.468697 4664 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ctx6m"]
Oct 03 09:17:14 crc kubenswrapper[4664]: E1003 09:17:14.469429 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33be4cfa-29e6-4e1d-99df-278ee7ab5863" containerName="registry-server"
Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.469443 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="33be4cfa-29e6-4e1d-99df-278ee7ab5863" containerName="registry-server"
Oct 03 09:17:14 crc kubenswrapper[4664]: E1003 09:17:14.469455 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33be4cfa-29e6-4e1d-99df-278ee7ab5863" containerName="extract-utilities"
Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.469461 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="33be4cfa-29e6-4e1d-99df-278ee7ab5863" containerName="extract-utilities"
Oct 03 09:17:14 crc kubenswrapper[4664]: E1003 09:17:14.469473 4664 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33be4cfa-29e6-4e1d-99df-278ee7ab5863" containerName="extract-content"
Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.469479 4664 state_mem.go:107] "Deleted CPUSet assignment" podUID="33be4cfa-29e6-4e1d-99df-278ee7ab5863" containerName="extract-content"
Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.469696 4664 memory_manager.go:354] "RemoveStaleState removing state" podUID="33be4cfa-29e6-4e1d-99df-278ee7ab5863" containerName="registry-server"
Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.471346 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ctx6m"
Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.478140 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ctx6m"]
Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.609967 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a4f770-b2b2-4365-9cd5-e033b7909548-utilities\") pod \"community-operators-ctx6m\" (UID: \"e4a4f770-b2b2-4365-9cd5-e033b7909548\") " pod="openshift-marketplace/community-operators-ctx6m"
Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.610333 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmmfp\" (UniqueName: \"kubernetes.io/projected/e4a4f770-b2b2-4365-9cd5-e033b7909548-kube-api-access-fmmfp\") pod \"community-operators-ctx6m\" (UID: \"e4a4f770-b2b2-4365-9cd5-e033b7909548\") " pod="openshift-marketplace/community-operators-ctx6m"
Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.610433 4664 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a4f770-b2b2-4365-9cd5-e033b7909548-catalog-content\") pod \"community-operators-ctx6m\" (UID: \"e4a4f770-b2b2-4365-9cd5-e033b7909548\") " pod="openshift-marketplace/community-operators-ctx6m"
Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.711916 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a4f770-b2b2-4365-9cd5-e033b7909548-utilities\") pod \"community-operators-ctx6m\" (UID: \"e4a4f770-b2b2-4365-9cd5-e033b7909548\") " pod="openshift-marketplace/community-operators-ctx6m"
Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.711982 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmmfp\" (UniqueName: \"kubernetes.io/projected/e4a4f770-b2b2-4365-9cd5-e033b7909548-kube-api-access-fmmfp\") pod \"community-operators-ctx6m\" (UID: \"e4a4f770-b2b2-4365-9cd5-e033b7909548\") " pod="openshift-marketplace/community-operators-ctx6m"
Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.712010 4664 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a4f770-b2b2-4365-9cd5-e033b7909548-catalog-content\") pod \"community-operators-ctx6m\" (UID: \"e4a4f770-b2b2-4365-9cd5-e033b7909548\") " pod="openshift-marketplace/community-operators-ctx6m"
Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.712677 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a4f770-b2b2-4365-9cd5-e033b7909548-catalog-content\") pod \"community-operators-ctx6m\" (UID: \"e4a4f770-b2b2-4365-9cd5-e033b7909548\") " pod="openshift-marketplace/community-operators-ctx6m"
Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.712723 4664 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a4f770-b2b2-4365-9cd5-e033b7909548-utilities\") pod \"community-operators-ctx6m\" (UID: \"e4a4f770-b2b2-4365-9cd5-e033b7909548\") " pod="openshift-marketplace/community-operators-ctx6m"
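Note: the VerifyControllerAttachedVolume / MountVolume.SetUp sequence above covers the three volumes every marketplace catalog pod here carries: two emptyDirs (utilities, catalog-content) and a projected service-account token (kube-api-access-*, injected by the API server). Roughly the equivalent volume stanza in Go types, a sketch using k8s.io/api/core/v1 with names taken from the log:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        // The two pod-local scratch volumes mounted above; the projected
        // kube-api-access-* token volume is added automatically and omitted here.
        vols := []corev1.Volume{
            {Name: "utilities", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
            {Name: "catalog-content", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
        }
        for _, v := range vols {
            fmt.Println(v.Name)
        }
    }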
"MountVolume.SetUp succeeded for volume \"kube-api-access-fmmfp\" (UniqueName: \"kubernetes.io/projected/e4a4f770-b2b2-4365-9cd5-e033b7909548-kube-api-access-fmmfp\") pod \"community-operators-ctx6m\" (UID: \"e4a4f770-b2b2-4365-9cd5-e033b7909548\") " pod="openshift-marketplace/community-operators-ctx6m" Oct 03 09:17:14 crc kubenswrapper[4664]: I1003 09:17:14.788140 4664 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ctx6m" Oct 03 09:17:15 crc kubenswrapper[4664]: I1003 09:17:15.313841 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ctx6m"] Oct 03 09:17:15 crc kubenswrapper[4664]: I1003 09:17:15.518391 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctx6m" event={"ID":"e4a4f770-b2b2-4365-9cd5-e033b7909548","Type":"ContainerStarted","Data":"608f69f4b2436a6b70ad77eadf6d66ce029d6cfb56ee9df9f9c1e0ce2aff02e5"} Oct 03 09:17:16 crc kubenswrapper[4664]: I1003 09:17:16.529433 4664 generic.go:334] "Generic (PLEG): container finished" podID="e4a4f770-b2b2-4365-9cd5-e033b7909548" containerID="2b58a639829c6fbdc028448a4a7c913ef3ac11ce2e3ff062cf3bae97c0c4a197" exitCode=0 Oct 03 09:17:16 crc kubenswrapper[4664]: I1003 09:17:16.529662 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctx6m" event={"ID":"e4a4f770-b2b2-4365-9cd5-e033b7909548","Type":"ContainerDied","Data":"2b58a639829c6fbdc028448a4a7c913ef3ac11ce2e3ff062cf3bae97c0c4a197"} Oct 03 09:17:16 crc kubenswrapper[4664]: I1003 09:17:16.640385 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2_bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49/util/0.log" Oct 03 09:17:16 crc kubenswrapper[4664]: I1003 09:17:16.822713 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2_bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49/pull/0.log" Oct 03 09:17:16 crc kubenswrapper[4664]: I1003 09:17:16.829866 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2_bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49/util/0.log" Oct 03 09:17:16 crc kubenswrapper[4664]: I1003 09:17:16.852965 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2_bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49/pull/0.log" Oct 03 09:17:17 crc kubenswrapper[4664]: I1003 09:17:17.020275 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2_bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49/pull/0.log" Oct 03 09:17:17 crc kubenswrapper[4664]: I1003 09:17:17.021326 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2_bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49/util/0.log" Oct 03 09:17:17 crc kubenswrapper[4664]: I1003 09:17:17.054514 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xd4q2_bf4d43c8-bf96-44f4-9c95-0d6bd4ba3b49/extract/0.log" Oct 03 09:17:17 crc kubenswrapper[4664]: I1003 09:17:17.224572 4664 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-27n9z_5844f220-7a74-41b4-9b03-eee894a66f32/extract-utilities/0.log" Oct 03 09:17:17 crc kubenswrapper[4664]: I1003 09:17:17.392908 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-27n9z_5844f220-7a74-41b4-9b03-eee894a66f32/extract-utilities/0.log" Oct 03 09:17:17 crc kubenswrapper[4664]: I1003 09:17:17.394172 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-27n9z_5844f220-7a74-41b4-9b03-eee894a66f32/extract-content/0.log" Oct 03 09:17:17 crc kubenswrapper[4664]: I1003 09:17:17.420113 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-27n9z_5844f220-7a74-41b4-9b03-eee894a66f32/extract-content/0.log" Oct 03 09:17:17 crc kubenswrapper[4664]: I1003 09:17:17.615048 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-27n9z_5844f220-7a74-41b4-9b03-eee894a66f32/extract-content/0.log" Oct 03 09:17:17 crc kubenswrapper[4664]: I1003 09:17:17.647088 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-27n9z_5844f220-7a74-41b4-9b03-eee894a66f32/extract-utilities/0.log" Oct 03 09:17:17 crc kubenswrapper[4664]: I1003 09:17:17.815667 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ctx6m_e4a4f770-b2b2-4365-9cd5-e033b7909548/extract-utilities/0.log" Oct 03 09:17:18 crc kubenswrapper[4664]: I1003 09:17:18.092138 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ctx6m_e4a4f770-b2b2-4365-9cd5-e033b7909548/extract-utilities/0.log" Oct 03 09:17:18 crc kubenswrapper[4664]: I1003 09:17:18.305484 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-27n9z_5844f220-7a74-41b4-9b03-eee894a66f32/registry-server/0.log" Oct 03 09:17:18 crc kubenswrapper[4664]: I1003 09:17:18.348150 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ctx6m_e4a4f770-b2b2-4365-9cd5-e033b7909548/extract-utilities/0.log" Oct 03 09:17:18 crc kubenswrapper[4664]: I1003 09:17:18.506939 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zr8nh_40c95b7d-be3e-4613-b795-f5d636b12ce4/extract-utilities/0.log" Oct 03 09:17:18 crc kubenswrapper[4664]: I1003 09:17:18.716561 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zr8nh_40c95b7d-be3e-4613-b795-f5d636b12ce4/extract-content/0.log" Oct 03 09:17:18 crc kubenswrapper[4664]: I1003 09:17:18.772095 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zr8nh_40c95b7d-be3e-4613-b795-f5d636b12ce4/extract-content/0.log" Oct 03 09:17:18 crc kubenswrapper[4664]: I1003 09:17:18.777337 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zr8nh_40c95b7d-be3e-4613-b795-f5d636b12ce4/extract-utilities/0.log" Oct 03 09:17:18 crc kubenswrapper[4664]: I1003 09:17:18.915059 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zr8nh_40c95b7d-be3e-4613-b795-f5d636b12ce4/extract-content/0.log" Oct 03 09:17:18 crc kubenswrapper[4664]: I1003 09:17:18.920370 4664 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-zr8nh_40c95b7d-be3e-4613-b795-f5d636b12ce4/extract-utilities/0.log" Oct 03 09:17:19 crc kubenswrapper[4664]: I1003 09:17:19.120767 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8_cd00ff11-ad1a-4739-9a03-5c01723dea02/util/0.log" Oct 03 09:17:19 crc kubenswrapper[4664]: I1003 09:17:19.251174 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8_cd00ff11-ad1a-4739-9a03-5c01723dea02/pull/0.log" Oct 03 09:17:19 crc kubenswrapper[4664]: I1003 09:17:19.321814 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8_cd00ff11-ad1a-4739-9a03-5c01723dea02/util/0.log" Oct 03 09:17:19 crc kubenswrapper[4664]: I1003 09:17:19.361899 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8_cd00ff11-ad1a-4739-9a03-5c01723dea02/pull/0.log" Oct 03 09:17:19 crc kubenswrapper[4664]: I1003 09:17:19.622338 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8_cd00ff11-ad1a-4739-9a03-5c01723dea02/pull/0.log" Oct 03 09:17:19 crc kubenswrapper[4664]: I1003 09:17:19.659172 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8_cd00ff11-ad1a-4739-9a03-5c01723dea02/util/0.log" Oct 03 09:17:19 crc kubenswrapper[4664]: I1003 09:17:19.664777 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cb76b8_cd00ff11-ad1a-4739-9a03-5c01723dea02/extract/0.log" Oct 03 09:17:19 crc kubenswrapper[4664]: I1003 09:17:19.953949 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gwncn_d87572d6-1577-487c-a43f-e99ea9b20724/marketplace-operator/0.log" Oct 03 09:17:19 crc kubenswrapper[4664]: I1003 09:17:19.970234 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zr8nh_40c95b7d-be3e-4613-b795-f5d636b12ce4/registry-server/0.log" Oct 03 09:17:20 crc kubenswrapper[4664]: I1003 09:17:20.116257 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pvwj7_3abe4747-28b2-446f-afd5-b0e736c90d03/extract-utilities/0.log" Oct 03 09:17:20 crc kubenswrapper[4664]: I1003 09:17:20.233196 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pvwj7_3abe4747-28b2-446f-afd5-b0e736c90d03/extract-utilities/0.log" Oct 03 09:17:20 crc kubenswrapper[4664]: I1003 09:17:20.282776 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pvwj7_3abe4747-28b2-446f-afd5-b0e736c90d03/extract-content/0.log" Oct 03 09:17:20 crc kubenswrapper[4664]: I1003 09:17:20.298341 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pvwj7_3abe4747-28b2-446f-afd5-b0e736c90d03/extract-content/0.log" Oct 03 09:17:20 crc kubenswrapper[4664]: I1003 09:17:20.451733 4664 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-pvwj7_3abe4747-28b2-446f-afd5-b0e736c90d03/extract-utilities/0.log" Oct 03 09:17:20 crc kubenswrapper[4664]: I1003 09:17:20.488482 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pvwj7_3abe4747-28b2-446f-afd5-b0e736c90d03/extract-content/0.log" Oct 03 09:17:20 crc kubenswrapper[4664]: I1003 09:17:20.551207 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dxhhl_61835213-5c1e-47e2-88ed-453c167e750d/extract-utilities/0.log" Oct 03 09:17:20 crc kubenswrapper[4664]: I1003 09:17:20.662442 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pvwj7_3abe4747-28b2-446f-afd5-b0e736c90d03/registry-server/0.log" Oct 03 09:17:20 crc kubenswrapper[4664]: I1003 09:17:20.729141 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dxhhl_61835213-5c1e-47e2-88ed-453c167e750d/extract-content/0.log" Oct 03 09:17:20 crc kubenswrapper[4664]: I1003 09:17:20.752405 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dxhhl_61835213-5c1e-47e2-88ed-453c167e750d/extract-utilities/0.log" Oct 03 09:17:20 crc kubenswrapper[4664]: I1003 09:17:20.762182 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dxhhl_61835213-5c1e-47e2-88ed-453c167e750d/extract-content/0.log" Oct 03 09:17:20 crc kubenswrapper[4664]: I1003 09:17:20.935432 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dxhhl_61835213-5c1e-47e2-88ed-453c167e750d/extract-utilities/0.log" Oct 03 09:17:20 crc kubenswrapper[4664]: I1003 09:17:20.938660 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dxhhl_61835213-5c1e-47e2-88ed-453c167e750d/extract-content/0.log" Oct 03 09:17:21 crc kubenswrapper[4664]: I1003 09:17:21.540200 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dxhhl_61835213-5c1e-47e2-88ed-453c167e750d/registry-server/0.log" Oct 03 09:17:24 crc kubenswrapper[4664]: I1003 09:17:24.610909 4664 generic.go:334] "Generic (PLEG): container finished" podID="e4a4f770-b2b2-4365-9cd5-e033b7909548" containerID="8aec288f299250e4c9eac8f45afeb0c48073d08e61d0f00d274b0e0f58667344" exitCode=0 Oct 03 09:17:24 crc kubenswrapper[4664]: I1003 09:17:24.610960 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctx6m" event={"ID":"e4a4f770-b2b2-4365-9cd5-e033b7909548","Type":"ContainerDied","Data":"8aec288f299250e4c9eac8f45afeb0c48073d08e61d0f00d274b0e0f58667344"} Oct 03 09:17:26 crc kubenswrapper[4664]: I1003 09:17:26.634629 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctx6m" event={"ID":"e4a4f770-b2b2-4365-9cd5-e033b7909548","Type":"ContainerStarted","Data":"0129461c2e4c8dcbe1a9f13d56d2acd53a4130040edd1d696b42892ced8ffbc1"} Oct 03 09:17:26 crc kubenswrapper[4664]: I1003 09:17:26.665465 4664 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ctx6m" podStartSLOduration=3.741728411 podStartE2EDuration="12.665441047s" podCreationTimestamp="2025-10-03 09:17:14 +0000 UTC" firstStartedPulling="2025-10-03 09:17:16.534428821 +0000 UTC m=+5337.355619311" lastFinishedPulling="2025-10-03 
09:17:25.458141467 +0000 UTC m=+5346.279331947" observedRunningTime="2025-10-03 09:17:26.65751616 +0000 UTC m=+5347.478706660" watchObservedRunningTime="2025-10-03 09:17:26.665441047 +0000 UTC m=+5347.486631547" Oct 03 09:17:26 crc kubenswrapper[4664]: I1003 09:17:26.877217 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107" Oct 03 09:17:26 crc kubenswrapper[4664]: E1003 09:17:26.877529 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:17:34 crc kubenswrapper[4664]: I1003 09:17:34.788849 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ctx6m" Oct 03 09:17:34 crc kubenswrapper[4664]: I1003 09:17:34.789346 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ctx6m" Oct 03 09:17:34 crc kubenswrapper[4664]: I1003 09:17:34.847305 4664 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ctx6m" Oct 03 09:17:36 crc kubenswrapper[4664]: I1003 09:17:36.718139 4664 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ctx6m" Oct 03 09:17:36 crc kubenswrapper[4664]: I1003 09:17:36.859745 4664 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ctx6m"] Oct 03 09:17:36 crc kubenswrapper[4664]: I1003 09:17:36.928503 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zr8nh"] Oct 03 09:17:37 crc kubenswrapper[4664]: I1003 09:17:37.764786 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zr8nh" podUID="40c95b7d-be3e-4613-b795-f5d636b12ce4" containerName="registry-server" containerID="cri-o://3330c5000fbbba8a868e43f570760da6b5dcae9570523be540f0e41bd6b6f8d1" gracePeriod=2 Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.395037 4664 util.go:48] "No ready sandbox for pod can be found. 
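Note: the pod_startup_latency_tracker entry for community-operators-ctx6m above decomposes cleanly: podStartE2EDuration = observedRunningTime - podCreationTimestamp = 12.665441047s, and podStartSLOduration excludes the image-pull window (lastFinishedPulling - firstStartedPulling ~= 8.924s), leaving ~3.7417s. A sketch of the arithmetic with the logged wall-clock timestamps (the tracker itself subtracts the monotonic m=+ offsets, hence a possible difference in the last digits):

    package main

    import (
        "fmt"
        "time"
    )

    // Go's default time.Time format; fractional seconds parse implicitly.
    const layout = "2006-01-02 15:04:05 -0700 MST"

    func mustParse(s string) time.Time {
        t, err := time.Parse(layout, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := mustParse("2025-10-03 09:17:14 +0000 UTC")
        firstPull := mustParse("2025-10-03 09:17:16.534428821 +0000 UTC")
        lastPull := mustParse("2025-10-03 09:17:25.458141467 +0000 UTC")
        running := mustParse("2025-10-03 09:17:26.665441047 +0000 UTC")

        e2e := running.Sub(created)          // podStartE2EDuration: 12.665441047s
        slo := e2e - lastPull.Sub(firstPull) // pull time excluded:  ~3.7417s
        fmt.Println(e2e, slo)
    }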
Need to start a new one" pod="openshift-marketplace/community-operators-zr8nh" Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.531388 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjf4l\" (UniqueName: \"kubernetes.io/projected/40c95b7d-be3e-4613-b795-f5d636b12ce4-kube-api-access-hjf4l\") pod \"40c95b7d-be3e-4613-b795-f5d636b12ce4\" (UID: \"40c95b7d-be3e-4613-b795-f5d636b12ce4\") " Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.531509 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40c95b7d-be3e-4613-b795-f5d636b12ce4-catalog-content\") pod \"40c95b7d-be3e-4613-b795-f5d636b12ce4\" (UID: \"40c95b7d-be3e-4613-b795-f5d636b12ce4\") " Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.531795 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40c95b7d-be3e-4613-b795-f5d636b12ce4-utilities\") pod \"40c95b7d-be3e-4613-b795-f5d636b12ce4\" (UID: \"40c95b7d-be3e-4613-b795-f5d636b12ce4\") " Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.533498 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40c95b7d-be3e-4613-b795-f5d636b12ce4-utilities" (OuterVolumeSpecName: "utilities") pod "40c95b7d-be3e-4613-b795-f5d636b12ce4" (UID: "40c95b7d-be3e-4613-b795-f5d636b12ce4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.603372 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40c95b7d-be3e-4613-b795-f5d636b12ce4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40c95b7d-be3e-4613-b795-f5d636b12ce4" (UID: "40c95b7d-be3e-4613-b795-f5d636b12ce4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.633907 4664 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40c95b7d-be3e-4613-b795-f5d636b12ce4-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.633944 4664 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40c95b7d-be3e-4613-b795-f5d636b12ce4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.780077 4664 generic.go:334] "Generic (PLEG): container finished" podID="40c95b7d-be3e-4613-b795-f5d636b12ce4" containerID="3330c5000fbbba8a868e43f570760da6b5dcae9570523be540f0e41bd6b6f8d1" exitCode=0 Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.780154 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr8nh" event={"ID":"40c95b7d-be3e-4613-b795-f5d636b12ce4","Type":"ContainerDied","Data":"3330c5000fbbba8a868e43f570760da6b5dcae9570523be540f0e41bd6b6f8d1"} Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.780183 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr8nh" event={"ID":"40c95b7d-be3e-4613-b795-f5d636b12ce4","Type":"ContainerDied","Data":"436d7e3d93761d0608ac3f96001e37dd9636b14db68b10f34834898a019172ab"} Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.780203 4664 scope.go:117] "RemoveContainer" containerID="3330c5000fbbba8a868e43f570760da6b5dcae9570523be540f0e41bd6b6f8d1" Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.780333 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zr8nh" Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.805995 4664 scope.go:117] "RemoveContainer" containerID="af9e636443a538c3543f4db0aa467d351570d19f417d119f7a40068a46c0507c" Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.876775 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107" Oct 03 09:17:38 crc kubenswrapper[4664]: E1003 09:17:38.877108 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.924934 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c95b7d-be3e-4613-b795-f5d636b12ce4-kube-api-access-hjf4l" (OuterVolumeSpecName: "kube-api-access-hjf4l") pod "40c95b7d-be3e-4613-b795-f5d636b12ce4" (UID: "40c95b7d-be3e-4613-b795-f5d636b12ce4"). InnerVolumeSpecName "kube-api-access-hjf4l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.937713 4664 scope.go:117] "RemoveContainer" containerID="83388f83612b239f9e1c2d77a9a22ddb080260b1599486772900c36057c7ea34" Oct 03 09:17:38 crc kubenswrapper[4664]: I1003 09:17:38.939383 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjf4l\" (UniqueName: \"kubernetes.io/projected/40c95b7d-be3e-4613-b795-f5d636b12ce4-kube-api-access-hjf4l\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:39 crc kubenswrapper[4664]: I1003 09:17:39.280177 4664 scope.go:117] "RemoveContainer" containerID="3330c5000fbbba8a868e43f570760da6b5dcae9570523be540f0e41bd6b6f8d1" Oct 03 09:17:39 crc kubenswrapper[4664]: E1003 09:17:39.282194 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3330c5000fbbba8a868e43f570760da6b5dcae9570523be540f0e41bd6b6f8d1\": container with ID starting with 3330c5000fbbba8a868e43f570760da6b5dcae9570523be540f0e41bd6b6f8d1 not found: ID does not exist" containerID="3330c5000fbbba8a868e43f570760da6b5dcae9570523be540f0e41bd6b6f8d1" Oct 03 09:17:39 crc kubenswrapper[4664]: I1003 09:17:39.282242 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3330c5000fbbba8a868e43f570760da6b5dcae9570523be540f0e41bd6b6f8d1"} err="failed to get container status \"3330c5000fbbba8a868e43f570760da6b5dcae9570523be540f0e41bd6b6f8d1\": rpc error: code = NotFound desc = could not find container \"3330c5000fbbba8a868e43f570760da6b5dcae9570523be540f0e41bd6b6f8d1\": container with ID starting with 3330c5000fbbba8a868e43f570760da6b5dcae9570523be540f0e41bd6b6f8d1 not found: ID does not exist" Oct 03 09:17:39 crc kubenswrapper[4664]: I1003 09:17:39.282274 4664 scope.go:117] "RemoveContainer" containerID="af9e636443a538c3543f4db0aa467d351570d19f417d119f7a40068a46c0507c" Oct 03 09:17:39 crc kubenswrapper[4664]: E1003 09:17:39.282818 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af9e636443a538c3543f4db0aa467d351570d19f417d119f7a40068a46c0507c\": container with ID starting with af9e636443a538c3543f4db0aa467d351570d19f417d119f7a40068a46c0507c not found: ID does not exist" containerID="af9e636443a538c3543f4db0aa467d351570d19f417d119f7a40068a46c0507c" Oct 03 09:17:39 crc kubenswrapper[4664]: I1003 09:17:39.282849 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af9e636443a538c3543f4db0aa467d351570d19f417d119f7a40068a46c0507c"} err="failed to get container status \"af9e636443a538c3543f4db0aa467d351570d19f417d119f7a40068a46c0507c\": rpc error: code = NotFound desc = could not find container \"af9e636443a538c3543f4db0aa467d351570d19f417d119f7a40068a46c0507c\": container with ID starting with af9e636443a538c3543f4db0aa467d351570d19f417d119f7a40068a46c0507c not found: ID does not exist" Oct 03 09:17:39 crc kubenswrapper[4664]: I1003 09:17:39.282871 4664 scope.go:117] "RemoveContainer" containerID="83388f83612b239f9e1c2d77a9a22ddb080260b1599486772900c36057c7ea34" Oct 03 09:17:39 crc kubenswrapper[4664]: E1003 09:17:39.283440 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83388f83612b239f9e1c2d77a9a22ddb080260b1599486772900c36057c7ea34\": container with ID starting with 83388f83612b239f9e1c2d77a9a22ddb080260b1599486772900c36057c7ea34 not found: ID does not 
exist" containerID="83388f83612b239f9e1c2d77a9a22ddb080260b1599486772900c36057c7ea34" Oct 03 09:17:39 crc kubenswrapper[4664]: I1003 09:17:39.283466 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83388f83612b239f9e1c2d77a9a22ddb080260b1599486772900c36057c7ea34"} err="failed to get container status \"83388f83612b239f9e1c2d77a9a22ddb080260b1599486772900c36057c7ea34\": rpc error: code = NotFound desc = could not find container \"83388f83612b239f9e1c2d77a9a22ddb080260b1599486772900c36057c7ea34\": container with ID starting with 83388f83612b239f9e1c2d77a9a22ddb080260b1599486772900c36057c7ea34 not found: ID does not exist" Oct 03 09:17:39 crc kubenswrapper[4664]: I1003 09:17:39.306101 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zr8nh"] Oct 03 09:17:39 crc kubenswrapper[4664]: I1003 09:17:39.316080 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zr8nh"] Oct 03 09:17:39 crc kubenswrapper[4664]: I1003 09:17:39.885594 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c95b7d-be3e-4613-b795-f5d636b12ce4" path="/var/lib/kubelet/pods/40c95b7d-be3e-4613-b795-f5d636b12ce4/volumes" Oct 03 09:17:53 crc kubenswrapper[4664]: I1003 09:17:53.877139 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107" Oct 03 09:17:53 crc kubenswrapper[4664]: E1003 09:17:53.877792 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:18:08 crc kubenswrapper[4664]: I1003 09:18:08.877589 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107" Oct 03 09:18:08 crc kubenswrapper[4664]: E1003 09:18:08.879037 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:18:19 crc kubenswrapper[4664]: I1003 09:18:19.893070 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107" Oct 03 09:18:19 crc kubenswrapper[4664]: E1003 09:18:19.894300 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:18:32 crc kubenswrapper[4664]: I1003 09:18:32.876526 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107" Oct 03 09:18:32 crc kubenswrapper[4664]: E1003 09:18:32.879733 4664 
Oct 03 09:18:47 crc kubenswrapper[4664]: I1003 09:18:47.876914 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107"
Oct 03 09:18:47 crc kubenswrapper[4664]: E1003 09:18:47.877781 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 09:18:59 crc kubenswrapper[4664]: I1003 09:18:59.893302 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107"
Oct 03 09:18:59 crc kubenswrapper[4664]: E1003 09:18:59.894409 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 09:19:11 crc kubenswrapper[4664]: I1003 09:19:11.878063 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107"
Oct 03 09:19:11 crc kubenswrapper[4664]: E1003 09:19:11.879262 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 09:19:17 crc kubenswrapper[4664]: I1003 09:19:17.893569 4664 generic.go:334] "Generic (PLEG): container finished" podID="b5278517-8bdb-4383-9477-631fa551bf9d" containerID="e33c8e84bd795cd47f958735dda1723f43e435e4ab67440b695bdb89fa27ea12" exitCode=0
Oct 03 09:19:17 crc kubenswrapper[4664]: I1003 09:19:17.894575 4664 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7wcj/must-gather-8zp6x" event={"ID":"b5278517-8bdb-4383-9477-631fa551bf9d","Type":"ContainerDied","Data":"e33c8e84bd795cd47f958735dda1723f43e435e4ab67440b695bdb89fa27ea12"}
Oct 03 09:19:17 crc kubenswrapper[4664]: I1003 09:19:17.895602 4664 scope.go:117] "RemoveContainer" containerID="e33c8e84bd795cd47f958735dda1723f43e435e4ab67440b695bdb89fa27ea12"
Oct 03 09:19:18 crc kubenswrapper[4664]: I1003 09:19:18.559404 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v7wcj_must-gather-8zp6x_b5278517-8bdb-4383-9477-631fa551bf9d/gather/0.log"
Oct 03 09:19:25 crc kubenswrapper[4664]: I1003 09:19:25.877425 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107"
Oct 03 09:19:25 crc kubenswrapper[4664]: E1003 09:19:25.878547 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 09:19:26 crc kubenswrapper[4664]: I1003 09:19:26.427246 4664 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v7wcj/must-gather-8zp6x"]
Oct 03 09:19:26 crc kubenswrapper[4664]: I1003 09:19:26.427550 4664 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-v7wcj/must-gather-8zp6x" podUID="b5278517-8bdb-4383-9477-631fa551bf9d" containerName="copy" containerID="cri-o://f368ce669fa9b72d5bbca31a768ae5e0196ebab5542a4cd390ff60e5aa30cce9" gracePeriod=2
Oct 03 09:19:26 crc kubenswrapper[4664]: I1003 09:19:26.438036 4664 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v7wcj/must-gather-8zp6x"]
Oct 03 09:19:26 crc kubenswrapper[4664]: I1003 09:19:26.881123 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v7wcj_must-gather-8zp6x_b5278517-8bdb-4383-9477-631fa551bf9d/copy/0.log"
Oct 03 09:19:26 crc kubenswrapper[4664]: I1003 09:19:26.881716 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v7wcj/must-gather-8zp6x"
Oct 03 09:19:26 crc kubenswrapper[4664]: I1003 09:19:26.978287 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b5278517-8bdb-4383-9477-631fa551bf9d-must-gather-output\") pod \"b5278517-8bdb-4383-9477-631fa551bf9d\" (UID: \"b5278517-8bdb-4383-9477-631fa551bf9d\") "
Oct 03 09:19:26 crc kubenswrapper[4664]: I1003 09:19:26.978553 4664 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxnvh\" (UniqueName: \"kubernetes.io/projected/b5278517-8bdb-4383-9477-631fa551bf9d-kube-api-access-wxnvh\") pod \"b5278517-8bdb-4383-9477-631fa551bf9d\" (UID: \"b5278517-8bdb-4383-9477-631fa551bf9d\") "
Oct 03 09:19:26 crc kubenswrapper[4664]: I1003 09:19:26.987810 4664 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v7wcj_must-gather-8zp6x_b5278517-8bdb-4383-9477-631fa551bf9d/copy/0.log"
Oct 03 09:19:26 crc kubenswrapper[4664]: I1003 09:19:26.988291 4664 generic.go:334] "Generic (PLEG): container finished" podID="b5278517-8bdb-4383-9477-631fa551bf9d" containerID="f368ce669fa9b72d5bbca31a768ae5e0196ebab5542a4cd390ff60e5aa30cce9" exitCode=143
Oct 03 09:19:26 crc kubenswrapper[4664]: I1003 09:19:26.988355 4664 scope.go:117] "RemoveContainer" containerID="f368ce669fa9b72d5bbca31a768ae5e0196ebab5542a4cd390ff60e5aa30cce9"
Oct 03 09:19:26 crc kubenswrapper[4664]: I1003 09:19:26.988359 4664 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v7wcj/must-gather-8zp6x"
Oct 03 09:19:26 crc kubenswrapper[4664]: I1003 09:19:26.993810 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5278517-8bdb-4383-9477-631fa551bf9d-kube-api-access-wxnvh" (OuterVolumeSpecName: "kube-api-access-wxnvh") pod "b5278517-8bdb-4383-9477-631fa551bf9d" (UID: "b5278517-8bdb-4383-9477-631fa551bf9d"). InnerVolumeSpecName "kube-api-access-wxnvh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 09:19:27 crc kubenswrapper[4664]: I1003 09:19:27.068089 4664 scope.go:117] "RemoveContainer" containerID="e33c8e84bd795cd47f958735dda1723f43e435e4ab67440b695bdb89fa27ea12"
Oct 03 09:19:27 crc kubenswrapper[4664]: I1003 09:19:27.080878 4664 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxnvh\" (UniqueName: \"kubernetes.io/projected/b5278517-8bdb-4383-9477-631fa551bf9d-kube-api-access-wxnvh\") on node \"crc\" DevicePath \"\""
Oct 03 09:19:27 crc kubenswrapper[4664]: I1003 09:19:27.202824 4664 scope.go:117] "RemoveContainer" containerID="f368ce669fa9b72d5bbca31a768ae5e0196ebab5542a4cd390ff60e5aa30cce9"
Oct 03 09:19:27 crc kubenswrapper[4664]: E1003 09:19:27.205644 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f368ce669fa9b72d5bbca31a768ae5e0196ebab5542a4cd390ff60e5aa30cce9\": container with ID starting with f368ce669fa9b72d5bbca31a768ae5e0196ebab5542a4cd390ff60e5aa30cce9 not found: ID does not exist" containerID="f368ce669fa9b72d5bbca31a768ae5e0196ebab5542a4cd390ff60e5aa30cce9"
Oct 03 09:19:27 crc kubenswrapper[4664]: I1003 09:19:27.205702 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f368ce669fa9b72d5bbca31a768ae5e0196ebab5542a4cd390ff60e5aa30cce9"} err="failed to get container status \"f368ce669fa9b72d5bbca31a768ae5e0196ebab5542a4cd390ff60e5aa30cce9\": rpc error: code = NotFound desc = could not find container \"f368ce669fa9b72d5bbca31a768ae5e0196ebab5542a4cd390ff60e5aa30cce9\": container with ID starting with f368ce669fa9b72d5bbca31a768ae5e0196ebab5542a4cd390ff60e5aa30cce9 not found: ID does not exist"
Oct 03 09:19:27 crc kubenswrapper[4664]: I1003 09:19:27.205735 4664 scope.go:117] "RemoveContainer" containerID="e33c8e84bd795cd47f958735dda1723f43e435e4ab67440b695bdb89fa27ea12"
Oct 03 09:19:27 crc kubenswrapper[4664]: E1003 09:19:27.206707 4664 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e33c8e84bd795cd47f958735dda1723f43e435e4ab67440b695bdb89fa27ea12\": container with ID starting with e33c8e84bd795cd47f958735dda1723f43e435e4ab67440b695bdb89fa27ea12 not found: ID does not exist" containerID="e33c8e84bd795cd47f958735dda1723f43e435e4ab67440b695bdb89fa27ea12"
Oct 03 09:19:27 crc kubenswrapper[4664]: I1003 09:19:27.206745 4664 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33c8e84bd795cd47f958735dda1723f43e435e4ab67440b695bdb89fa27ea12"} err="failed to get container status \"e33c8e84bd795cd47f958735dda1723f43e435e4ab67440b695bdb89fa27ea12\": rpc error: code = NotFound desc = could not find container \"e33c8e84bd795cd47f958735dda1723f43e435e4ab67440b695bdb89fa27ea12\": container with ID starting with e33c8e84bd795cd47f958735dda1723f43e435e4ab67440b695bdb89fa27ea12 not found: ID does not exist"
Oct 03 09:19:27 crc kubenswrapper[4664]: I1003 09:19:27.208882 4664 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5278517-8bdb-4383-9477-631fa551bf9d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b5278517-8bdb-4383-9477-631fa551bf9d" (UID: "b5278517-8bdb-4383-9477-631fa551bf9d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:19:27 crc kubenswrapper[4664]: I1003 09:19:27.287700 4664 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b5278517-8bdb-4383-9477-631fa551bf9d-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 03 09:19:27 crc kubenswrapper[4664]: I1003 09:19:27.889858 4664 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5278517-8bdb-4383-9477-631fa551bf9d" path="/var/lib/kubelet/pods/b5278517-8bdb-4383-9477-631fa551bf9d/volumes"
Oct 03 09:19:37 crc kubenswrapper[4664]: I1003 09:19:37.877023 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107"
Oct 03 09:19:37 crc kubenswrapper[4664]: E1003 09:19:37.877806 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 09:19:49 crc kubenswrapper[4664]: I1003 09:19:49.885049 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107"
Oct 03 09:19:49 crc kubenswrapper[4664]: E1003 09:19:49.885832 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 09:20:03 crc kubenswrapper[4664]: I1003 09:20:03.876224 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107"
Oct 03 09:20:03 crc kubenswrapper[4664]: E1003 09:20:03.877039 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"
Oct 03 09:20:18 crc kubenswrapper[4664]: I1003 09:20:18.878838 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107"
Oct 03 09:20:18 crc kubenswrapper[4664]: E1003 09:20:18.882023 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm"
podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:20:28 crc kubenswrapper[4664]: I1003 09:20:28.095908 4664 scope.go:117] "RemoveContainer" containerID="a81ddb17cf9d0844c4675cf281837cb42bba29baac0fc4488c75aaf5e125eb73" Oct 03 09:20:30 crc kubenswrapper[4664]: I1003 09:20:30.876474 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107" Oct 03 09:20:30 crc kubenswrapper[4664]: E1003 09:20:30.877328 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:20:43 crc kubenswrapper[4664]: I1003 09:20:43.876700 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107" Oct 03 09:20:43 crc kubenswrapper[4664]: E1003 09:20:43.877443 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:20:57 crc kubenswrapper[4664]: I1003 09:20:57.877008 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107" Oct 03 09:20:57 crc kubenswrapper[4664]: E1003 09:20:57.877884 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:21:09 crc kubenswrapper[4664]: I1003 09:21:09.881389 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107" Oct 03 09:21:09 crc kubenswrapper[4664]: E1003 09:21:09.882354 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b" Oct 03 09:21:21 crc kubenswrapper[4664]: I1003 09:21:21.876876 4664 scope.go:117] "RemoveContainer" containerID="ec8cb97235e056eca8cb80e25dc332b41b368618452bc85cd99bf48d88672107" Oct 03 09:21:21 crc kubenswrapper[4664]: E1003 09:21:21.878211 4664 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x9dgm_openshift-machine-config-operator(598b81ce-0ce7-498f-9337-ae5e6e64682b)\"" pod="openshift-machine-config-operator/machine-config-daemon-x9dgm" 
podUID="598b81ce-0ce7-498f-9337-ae5e6e64682b"